@openmastering @falktx @unfa @dump_stack
i figure it totally depends on how this will be designed and implemented, and how buggy it will be. i hope gnome will not be a dependency etc. i'm wary of anything connected to red hat and gnome. pulseaudio was a mess for quite a while.
i love the separation (and compatibility) between jack (PRO!) and ALSA (average user). on the other hand it would be great if everything would *just work*: low latency, everything talking to each other, and so on. so, i cheer for this!
@openmastering @unfa @dump_stack@lor.sh not sure... the pipewire devs recommend using the same (pulse/jack) APIs we use right now. I guess that's because not everyone is going to have pipewire installed, but they will have pulseaudio and/or jack.
But from a developer's perspective, the JACK API is very simple yet lets you do a lot, so I do not see enough of a reason to use the pipewire APIs directly, since pipewire's custom libjack is staying with us for the long term.
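To give a sense of how little the JACK API asks of a client, here is a rough sketch (not from the thread) of a complete passthrough client; the client name "passthru" and the port names are just illustrative, and the same code links unchanged against pipewire's libjack:

```c
/* Minimal JACK passthrough client: copies one input port to one output port.
 * Builds with e.g. `gcc passthru.c -o passthru -ljack`.
 * The client name "passthru" and the port names are arbitrary examples. */
#include <jack/jack.h>
#include <stdio.h>
#include <string.h>

static jack_port_t *in_port, *out_port;

/* Called by JACK in its realtime thread once per audio period. */
static int process(jack_nframes_t nframes, void *arg)
{
    jack_default_audio_sample_t *in  = jack_port_get_buffer(in_port,  nframes);
    jack_default_audio_sample_t *out = jack_port_get_buffer(out_port, nframes);
    memcpy(out, in, sizeof(jack_default_audio_sample_t) * nframes);
    return 0;
}

int main(void)
{
    jack_client_t *client = jack_client_open("passthru", JackNullOption, NULL);
    if (!client) {
        fprintf(stderr, "could not connect to a JACK (or pipewire-jack) server\n");
        return 1;
    }

    jack_set_process_callback(client, process, NULL);
    in_port  = jack_port_register(client, "in",  JACK_DEFAULT_AUDIO_TYPE, JackPortIsInput,  0);
    out_port = jack_port_register(client, "out", JACK_DEFAULT_AUDIO_TYPE, JackPortIsOutput, 0);

    jack_activate(client);          /* ports now appear in the graph (e.g. in Carla's canvas) */
    printf("running; press Enter to quit\n");
    getchar();
    jack_client_close(client);
    return 0;
}
```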
@falktx
So pipewire would just be a kind of glorified bridge between end-user protocols (jack & pulseaudio)?
I really hope that it transforms into a unified thing where users can have it all: low latency, patching, video, levels, etc. Adding a complexity layer just in order to bridge services is questionable.
@unfa @dump_stack
@openmastering @unfa @dump_stack@lor.sh hmm well, yes, and yes and yes. have you seen the pipewire post? the screenshot already shows that in action (audio-wise). You can see Chrome in Carla's canvas (running in JACK mode), so we can already take its audio and pass it through anything else in the graph.
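That kind of rerouting is ordinary JACK graph manipulation; here is a hedged sketch of what Carla's canvas effectively does when you drag a cable. The port names "Google Chrome:output_FL" and "SomeEffect:input_1" are hypothetical placeholders; list the real ones with jack_get_ports() first.

```c
/* Sketch: rewiring the JACK graph from code. Port names are hypothetical
 * placeholders; query the real ones with jack_get_ports(). */
#include <jack/jack.h>
#include <stdio.h>

int main(void)
{
    jack_client_t *client = jack_client_open("patcher", JackNoStartServer, NULL);
    if (!client)
        return 1;
    jack_activate(client);

    /* Print every registered port, e.g. to see what Chrome exposes. */
    const char **ports = jack_get_ports(client, NULL, NULL, 0);
    for (int i = 0; ports && ports[i]; i++)
        printf("%s\n", ports[i]);
    jack_free(ports);

    /* Route a (hypothetical) Chrome output into a (hypothetical) effect input. */
    jack_connect(client, "Google Chrome:output_FL", "SomeEffect:input_1");

    jack_client_close(client);
    return 0;
}
```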
@falktx
So long term, jack and pulse audio will become obsolete, right? The jack/pulse interfaces are kind of a transition thing?
@unfa @dump_stack