Remote Mac with Android control? [Solved]

My primary Roon music library is on a Mac connected to my main system through an Auralic Vega DAC. I also have two secondary systems based on Macs that remote in to play music; both of those Macs are connected to iFi DACs. This works fine as long as I control playback from the computer itself. Recently I started using Roon on an Android tablet. I can connect fine and play music on the main system attached to the primary library, but I can’t seem to control the other two systems that are connected to the “remote” Macs. I’d like to run one of those computers headless, so it would be really great if Android could control it. Is this possible, or am I just missing a setting somewhere?

Thanks,

Robert

Hi Robert,

You’re not doing anything wrong, but at the moment Roon doesn’t support a Remote controlling a private zone on another Remote. That will be coming soon with RoonSpeakers. Until then, you will only be able to control private zones on your secondary Macs from the Remote running on that secondary Mac.

Some detail about RoonSpeakers can be found here.

Thanks…good to know that eventually I will be able to do this.

Will RoonSpeakers also eventually allow private zones to be “synchronized”, meaning all playing the same music at the same time? I know that right now this requires identical DACs across the zones. Eventually, will you be able to have different DACs in each zone but still have the same parent music file being sent? I thought I remembered that this might be a feature of RoonSpeakers.

Danny went into a bit more detail about grouping below:

As I understand his comments towards the foot of that thread, both grouping USB zones and grouping zones of unlike types are on the roadmap.

Let’s ask @danny or @brian if they can tell us anything further about grouping and the forthcoming release of RoonSpeakers. Are you guys planning improved grouping straight up, or is it likely to follow on after an initial release?

Thanks, I think Danny and I discussed this months back. The mixed endpoint is certainly challenging because Roon gives the clocking to the endpoint (RoonSpeakers)…which is generally a good thing. But one consequence is that if you try to send a DSD128 file to one DAC that can handle it and another that only does DSD64 or just PCM 24/192, things can get complicated when you’re trying to get them all to play at the same time. Likewise, no idea how this will work with HQPlayer integration…I would likely be running that on all the endpoints.
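To make the problem concrete, here’s a rough sketch of the “fall back to a common format” idea — purely my own illustration in Python, not anything from Roon’s code, and the endpoint names and capability numbers are made up:

```python
# Hypothetical sketch: pick one stream format that every grouped endpoint can
# accept, so the same clock-locked stream can be sent to the whole group.
from dataclasses import dataclass

@dataclass
class Endpoint:
    name: str
    max_pcm_rate: int   # highest PCM sample rate the DAC accepts, in Hz
    max_dsd_rate: int   # highest DSD rate in Hz; 0 for a PCM-only DAC

def common_format(endpoints, source_is_dsd, source_rate):
    """Return the best format every endpoint in the group supports."""
    if source_is_dsd and all(e.max_dsd_rate >= source_rate for e in endpoints):
        return ("DSD", source_rate)
    # Otherwise convert to PCM at the highest rate common to the whole group.
    return ("PCM", min(e.max_pcm_rate for e in endpoints))

group = [
    Endpoint("Vega", max_pcm_rate=384_000, max_dsd_rate=5_644_800),  # DSD128-capable
    Endpoint("iFi",  max_pcm_rate=192_000, max_dsd_rate=2_822_400),  # DSD64 only
]

# A DSD128 source (5.6448 MHz) can't go to both natively, so the whole group
# would have to fall back to PCM at 192 kHz.
print(common_format(group, source_is_dsd=True, source_rate=5_644_800))
```

Any real implementation also has to keep the converted streams sample-aligned across endpoints, which is where the shared clocking comes back in.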

I haven’t heard of a solution that enables synchronisation between different resolution streams (it may exist, my experience is pretty limited). Creating a solution across a wide range of hardware does not sound simple and, if implemented successfully, seems likely to become a de facto standard.

It will be interesting to see how HQPlayer integration is handled and whether it will be a separate zone (perhaps in the same way that we see different USB zones for ASIO and WASAPI) or whether it will be integrated into a stream (as volume control and crossfade are). My guess is that because it is hardware-dependent we will see it in Audio setup, but that is pure speculation.

Let’s ask @danny or @brian if they can tell us anything further about grouping and the forthcoming release of RoonSpeakers. Are you guys planning improved grouping straight up, or is it likely to follow on after an initial release?

That’s likely to follow later.

Worth noting: as soon as RoonSpeakers software implementations are available, you’ll be able to effectively group local devices by running RoonSpeakers on the same box as your server and configuring your outputs that way instead of as “local outputs” within the app. There’s also the possibility that local outputs will be switched over to always use the RoonSpeakers code under the hood, which would make them groupable in a different way. We haven’t decided yet.

Cross-technology zone grouping (meaning grouping AirPlay/Meridian/etc. networked zones with Roon Zones or local outputs) is a more ambitious project. With some technologies, it’s seemingly straightforward. With others, there’s just no way to control the precise relationship between time-in-the-universe and time-in-the-audio-stream. No matter what, it’s a testing/QA nightmare. The number of permutations and potential bad interactions once you get a few technologies involved in the same stream is mind-boggling. We’re not afraid of this sort of thing, but there’s a good reason why no other product does this.
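For what it’s worth, here’s a toy sketch of why that time relationship matters — my own illustration with invented latency numbers, not Roon’s design:

```python
# Hypothetical sketch: synchronized zones need to map a shared wall-clock time
# onto a position in the audio stream. Protocols that can't report or honour
# that mapping ("fire and forget" streaming) can't be lined up precisely.
import time

class SyncableZone:
    """A zone whose protocol lets playback be scheduled against a shared clock."""
    def __init__(self, name: str, latency_s: float):
        self.name = name
        self.latency_s = latency_s   # declared or measured output latency

    def schedule(self, start_wallclock: float):
        # Hand off audio early enough that sample 0 reaches the speakers at
        # start_wallclock on every zone, regardless of each zone's latency.
        self.send_at = start_wallclock - self.latency_s

# Two zones with very different latencies can still start together...
zones = [SyncableZone("usb-dac", 0.010), SyncableZone("network-box", 0.250)]
start = time.time() + 1.0            # everyone agrees on a start time 1 s out
for z in zones:
    z.schedule(start)
    print(z.name, "sends audio at", z.send_at)

# ...but a zone whose protocol exposes no latency or position feedback gives us
# nothing to compute send_at from, so it will drift relative to the others.
```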