Tidal desktop application sounds better than ROON on the same box and with the same driver. Why?

I have a very resolving system. The difference in SQ is so evident that even my wife, who is not an audiophile and not even a music lover, can easily identify it. As a matter of fact, ROON sounds worse than Tidal desktop, Audirvana and JPlayStreamer. I tried all possible configurations, e.g. adding HQPlayer, a Linux endpoint, a Mac endpoint, Roon Core and RoonBridge together and separated, ROON Core on a QNAP NAS, Core i7, i5…

I tried upsampling to PCM and DSD. And nothing makes ROON sound as good as the players I mentioned, really disappointed.

As a lifetime subscriber I hope for SQ improvement with every new release but it seems to me that the ROON team puts most of the development efforts into UX and not SQ.

Protip for future contributors: neither of these are actually evidence of anything.

“A ‘blind test’ is a method of testing in which the people being experimented on have no idea about what they’re getting. This test method prevents results from being influenced by any a priori information. In the field of audio, blind tests truly highlight what a listener is able to hear.”

Your anecdote is not a blind test.
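For contrast, here is what a minimal blind (ABX) protocol looks like when written down. This is a Python sketch of my own, not any existing tool: X is randomly A or B on each trial, and afterwards you compute how likely the listener's score would be under pure guessing.

```python
import math
import random

def abx_truth(n_trials, seed=0):
    """For each trial, randomly choose whether the hidden X is A or B."""
    rng = random.Random(seed)
    return [rng.choice("AB") for _ in range(n_trials)]

def score(truth, answers):
    """Count correct identifications and return a one-sided binomial
    p-value: the probability of doing at least this well by guessing."""
    n = len(truth)
    correct = sum(t == a for t, a in zip(truth, answers))
    p = sum(math.comb(n, k) for k in range(correct, n + 1)) / 2 ** n
    return correct, p
```

Note that going 3-for-5 gives a p-value of 0.5, i.e. no evidence at all; a convincing session needs many trials with a high hit rate.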

I’ll leave this now, as I can only see this going one way. Next time you could just answer straightforwardly yes/no, or just ignore.

As a matter of fact it was a blind test. But I’m going to agree with you it’s better to leave it now.


Why did you buy a lifetime if you didn’t like how it sounded?

I started this thread to get users’ opinions on my ROON configuration. My question was whether I was missing something in my ROON setup. Out of more than 20 replies, only one, from @brian, was to the point.

Please don’t get me wrong, ROON sounds quite good. But the other players sound better in the same environment. The question is WHY?

Possible reasons why:

Volume levels aren’t 100% the same in the comparison, expectation bias, or JPlay is massaging the data. If, as you say, any random person off the street could tell the difference between JPlay and Roon, I’m going to say that JPlay isn’t bit perfect. The other possibility is that your DAC is so poorly implemented and so sensitive to computer noise that it only sounds good on your Mac with JPlay.
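On the volume point: the offsets that matter are smaller than people expect. Here is a rough Python sketch (names and thresholds are mine, not from any player) showing how you would measure the level difference between two captures of the same passage.

```python
import math

def rms_dbfs(samples):
    """RMS level of a block of normalized samples, in dBFS."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)

def level_offset_db(a, b):
    """Level offset between two renditions of the same passage.
    Offsets well under 1 dB can bias a sighted comparison: the
    slightly louder rendition tends to be judged as 'better'."""
    return rms_dbfs(a) - rms_dbfs(b)
```

A player whose output is only half the amplitude of another measures about 6 dB quieter, and much smaller gaps than that are still enough to skew a casual A/B.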

PS) It’s pretty hard for a lot of us to take you seriously when you try to pull the resolving card. Just a note to put in your audiophile etiquette playbook for the future.

You didn’t even bother to read the discussion before replying. My initial post was about the SQ difference between Tidal desktop and ROON with the same settings and the same driver. I already answered about volume leveling etc. If you have nothing to contribute, please ignore the thread. I don’t need anybody to educate me on etiquette, sorry.
It’s so easy to download Tidal desktop and compare it yourself. If you don’t hear the difference, then great, there is nothing to discuss.

My system has been designed to be immune to upstream changes so the source of the bit perfect data stream doesn’t matter.

So it’s obviously your DAC then. It’s so sensitive to computer-generated noise that it doesn’t like how Roon uses computer resources, and the noise the computer generates is being transmitted to your DAC.

If Roon still didn’t sound as good on a different endpoint, again it could be the noise generated by that endpoint and how your DAC reacts to it.

I tried it with 3 different DACS and the combination of components:

  1. Musical Paradise MP-D2 + Uptone USB Regen + JitterBug
  2. IFi micro iDSD + iUSB + linear PSU
  3. Arcam rDAC + Audiophilleo2 USB to SPDIF converter

As you can see the computer noise is taken care of. The Tidal desktop still sounds better to my ears.

“De gustibus non est disputandum” (there’s no disputing taste)


This is not a statement you should be comfortable making.

Managing analog domain effects like “computer noise” isn’t something that is or is not taken care of–noise is continuous, not discrete. If we’re being intellectually honest, the best you can say is “I bought and installed some products that claim to mitigate computer noise” not “the computer noise is taken care of”.

The scientifically accepted methods for measuring perceived effect are generally impractical in an audiophile context (gathering a representative population, controlling for interacting factors, designing/executing/repeating a double-blind study that proves statistical significance of an effect). Some fields invest in this sort of rigor (medical testing, for example), but we are not in one of those fields. I think sometimes we lose track of that perspective when discussing this stuff.

So what we’re doing here is not stating why things are different–we are theorizing about why they might be. No-one in this discussion is prepared to do a rigorous experiment, so we shouldn’t treat our conclusions as facts. In these discussions, I often ask myself–would an expert feel comfortable making this statement with 100% certainty? And the answer is seldom “yes”.

Anyways…

The first thing to be sure of is–are the bits going to the DAC the same in both cases? When USB is involved, I like to verify this by sniffing the USB bus. If the bits are different, then the investigation stops there. That difference must be eliminated before proceeding to other possible explanations.
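Once you have the raw sample payloads out of a bus capture (or out of any loopback/capture tool), checking "same bits" reduces to comparing digests. A minimal Python sketch of my own, assuming the captured streams have been exported as raw PCM files:

```python
import hashlib

def pcm_fingerprint(path, chunk_size=1 << 20):
    """SHA-256 over a captured raw PCM stream (e.g. samples exported
    from a USB bus capture). Equal digests mean bit-identical audio."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()
```

If `pcm_fingerprint("roon.pcm")` and `pcm_fingerprint("tidal.pcm")` match, the two players delivered identical data and the investigation moves on to out-of-band explanations.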

But let’s assume that the bits are the same. What else could be causing a difference?

Since we’re talking about USB, it’s important to understand that UAC 2.0–the protocol spoken to your USB device–is an asynchronous data transmission protocol with error detection:

  • Asynchronous means that it decouples the exact timing characteristics of the two sides of communication. Instead of requiring that data arrive “in real time”, it must only arrive “in time”. Because of this decoupling, variance in the arrival time of USB packets does not cause corresponding timing differences in rendered audio samples.
  • Data transmission means that the protocol transmits information, not audio signals. That means that the rules of information theory apply to this process.
  • Error detection means that if data is damaged in transit, it will be dropped instead of being rendered with subtly degraded quality.
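The asynchronous decoupling above can be shown with a toy model. This is a Python sketch of mine, not real driver code: packets are pushed in with irregular timing, but samples are clocked out at a fixed rate by the device's own clock.

```python
from collections import deque

class AsyncEndpoint:
    """Toy model of an asynchronous USB audio endpoint. Packets may
    arrive early, late, or in bursts, but samples leave the buffer at
    a fixed rate driven by the DAC's own clock, so arrival jitter
    never reaches the rendered audio. Failure is catastrophic
    (an underrun), not a subtle degradation."""

    def __init__(self):
        self.buffer = deque()

    def packet_arrives(self, samples):
        # Host side: the timing of these calls is irregular by design.
        self.buffer.extend(samples)

    def clock_tick(self):
        # DAC side: exactly one sample per tick of the local clock.
        if not self.buffer:
            raise RuntimeError("buffer underrun: audible dropout")
        return self.buffer.popleft()
```

However bursty the arrivals, the output order and cadence are set entirely on the device side; the only way host timing shows up is as a dropout when the buffer runs dry.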

After transmitting data through such a system, we can be sure that if the system hasn’t failed catastrophically (dropouts or playback stopping), the information has arrived at the USB interface module within your DAC correctly, and in time for playback.

Taking a step back, if the data is the same, and neither Roon nor “X” is causing the DAC to fail totally, then we can be sure that the data itself is not the channel through which qualitative differences are becoming perceptible.

So what else is there?

Since the information is provably intact and identical, and a sound quality difference is perceived, the perceived difference must be caused by something out of band of the information stream.

Here’s a theory among 20 other theories I could come up with. I’m not claiming it to be true at all, just floating something so we can pick it apart:

When TIDAL transmits audio to the TIDAL app, the data is delivered in a FLAC container. When Roon transmits audio over RAAT, it is transmitted as PCM. The PCM stream is roughly twice the bitrate of the FLAC stream, so it creates extra work for the network interface chip–but–the FLAC stream requires an extra decoding step on the CPU that is not required with RAAT.
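The "roughly twice" figure is just arithmetic. Here is a sketch for the CD-quality case; the assumed 50% FLAC savings is my own round number for illustration, since real compression ratios vary widely with the material.

```python
def pcm_bitrate_kbps(rate_hz=44_100, bits=16, channels=2):
    """Raw PCM bitrate; for CD audio this is 1411.2 kbps."""
    return rate_hz * bits * channels / 1000

pcm_kbps = pcm_bitrate_kbps()
# Assumed ~50% FLAC savings; actual ratios depend on the recording.
flac_kbps_estimate = pcm_kbps * 0.5
```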

Both the CPU and the networking hardware radiate electromagnetic interference which may influence other components in the audio system. Maybe Roon and TIDAL sound different because the effect size of the increase in network traffic is larger than the effect size of the increased CPU activity in this particular computer.

For a second, assume that this is a proven fact–would this be proof that Roon’s approach is inferior? Should it be an argument that Roon makes a change?

What about other setups? Maybe it’s only this particular network chip, and other network chips emit EMI at a consistent level regardless of traffic. Maybe the variation is actually related to a detail of the driver, not the chip. Maybe it’s only one revision of the chip. Maybe the chip isn’t at fault, but rather the arrangement of the passive components around it. Maybe it’s the driver, and the next driver version fixes the problem. Maybe some of the components in your machine came from a bad batch and aren’t within tolerance, but aren’t bad enough to make the interface stop communicating. Maybe, …

In audiophile discourse, people tend to black-box components in the system at the level of our personal understanding. “computer noise” is an imprecise term–it does not belong in a rigorous discussion. “network chip” is also imprecise. I am not an electrical engineer, but I can think in great detail about how software and networked systems work inside. Those details feel bigger to me than hardware implementation details that I am not as familiar with. The difference between how large an implementation detail feels in your mind and the actual objective effect it has on sound quality can be huge.

But more to the point–that whole line of reasoning was conjecture. We live in the real world, where we must make decisions about how to build software. How sure should we have to be about that theory in order to act on it? More sure than I am about that one, surely.

TIDAL made their choice to transmit compressed FLAC because it’s cheaper and more reliable for them to do things that way. We made a choice to decode/process audio in the Core and then transmit it as PCM because our architecture accommodates lightweight endpoints that cannot do that processing themselves. TIDAL works on my phone in the car. Roon does not. Very different products, very different architectures, very different choices. More importantly–the choices were not made with the goal of increasing sound quality in a particular system in either case. The effect that those decisions have on sound quality is incidental and accidental.

There are billions of possible combinations of DAC, Amplifier, Speaker, Network Switch, Computer, Software, … And that’s just stuff you associate with audio in your mind. Your audio gear is just as connected to the power grid. It may source its energy differently at night. Your refrigerator cycles on and off with a mind of its own. Cell phone traffic from the nearest towers varies at times of day. I’m not making claims about the effect size of any of these things, just trying to bring some perspective to how complex things get once this door is opened.

No experiment conducted on an audio system installed in a home is ever well-controlled for these factors. And no manufacturer can effectively control for them either. Companies making audiophile products do not have unlimited resources. They work with their own products, and perhaps a few representative configurations for QA–not the full field of system permutations that the products might eventually be a part of.

But if these out-of-band effects matter (and many audiophiles clearly behave as if they do), then who should be responsible for them?

This is how I would look at it, assuming “different bits” has been ruled out as an explanation…

First–Each product is responsible for what happens inside of their walls, for implementing interoperability standards properly, for minimizing forms of interference that it may radiate, and for not accepting interference that it may receive. This set of rules makes the world go round. We should hold all manufacturers to this standard.

Second–if the theory is that the DAC is accepting interference from the network chip (or any other source), the party most equipped to fix this is the DAC manufacturer. Help them reproduce/measure the effect, and they can work towards a technical solution. This is where most of the accessible improvements will be in practice.

Third–since the effect isn’t in the information stream itself, I would direct skepticism at the filtering/isolation add-ons involved. Isn’t their stated purpose to mitigate out-of-band effects to the point where they become inaudible? If they are working, why is such an obviously audible effect making it through?

Fourth–if interference is traveling through the air rather than through wires, increasing space makes a difference because of the inverse square law. This is why one of the very few sound quality recommendations that we make is to put space between products that must be in the listening room and products that could be moved outside (like media servers). Physics tells us that if there are out of band effects traveling through the air as RF, increasing space will mitigate those effects by a certain amount.
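The inverse-square relationship is easy to quantify. A small Python sketch (an idealized free-space model, not a claim about any specific room or product):

```python
import math

def relative_power(d1, d2):
    """Inverse-square law: radiated power density at distance d2,
    relative to distance d1, in free space."""
    return (d1 / d2) ** 2

def attenuation_db(d1, d2):
    """The same ratio expressed in dB (negative = quieter)."""
    return 10 * math.log10(relative_power(d1, d2))
```

Doubling the distance cuts the radiated power density to a quarter, about 6 dB, which is why moving a media server out of the rack (or out of the room) is one of the cheapest mitigations available.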

Finally…

I’m interested in doing whatever can practically be done to improve perceived sound quality.

We receive all kinds of sound quality feedback–positive, negative, and “sounds the same”. Sometimes the privately stated opinion of the manufacturer is very different from the opinions of their users (of course, no-one wants to be rude and say “we think you’re wrong”, so they rarely make those statements in public).

Very little of the feedback we receive is technically actionable, or even reproducible. Yes, we try. Often.

The people in the best position to criticize our handling of audio in a technically actionable way are our hardware partners–and they rarely do. The most useful feedback is “you are doing X wrong. It affects our product in this way. Please fix”. Of course we take that kind of feedback seriously and act on it when it comes (these days, not so often…).


I agree with Gu-Gu. I love Roon and am not complaining. My Roon is on a Mac Mini and I have two different DACs. For MQA I use a Meridian Explorer 2. For non-MQA I use the optical out on the Mac Mini to a Prism Lyra 2. I’m an audio crazy with a very resolving system. The difference between the Tidal app and Roon is very clear, as Gu-Gu states. The Roon app has a great remote, and Radio lets me explore my large CD library. I also listen to a lot of vinyl with a large record collection. My point is that SQ is something I’m sensitive to. I also think the MQA on both Roon and Tidal is much better than non-MQA.

I got a coupon for a 2-month Roon trial when I bought my AQ Dragonfly Red, and the reason I originally got an annual membership (not the lifetime I switched to later) was the difference I heard between the Tidal desktop app and Roon (to my audiophile ears, Roon was better). I have to confess that I did not change any settings in the Tidal desktop app. The only difference I have with @Go_Ga is that my Roon Core runs on Ubuntu Linux and the desktop that Tidal runs on is a Mac. As you may know, Mac and Linux have better native USB drivers than Windows. Schiit provides a USB driver for Windows but not for Mac and Linux. I have never run Roon on Windows, but Plex server (I use it for video) on Windows performs so badly compared to Linux, due to the Windows OS dispatching algorithm (mediocre at best). Movies that I have recorded digitally on Windows all have distorted gaps throughout. Again, I have to confess that in this case I tried both Windows 7 and 10 and not the expensive 2012 server that @Go_Ga uses.

Hi @brian,

Thank you for taking the time to write such a long post. Unfortunately, your post explains nothing to me. Yes, I agree there are too many variables here, however there are a number of audiophile players that consistently sound better than ROON in the same environment. I’m sorry, but as the end user I don’t really care how many different systems your product is intended to run on. All I care about is how it sounds in my system. I don’t want to sound like a troll, but I also don’t buy your theory about “interference traveling through the air rather than through wires” (this must be a joke) and other voodoo stuff. I still like the UX of your product, however.

Best.

Have you ever heard that sound that car stereos used to make when a GSM cell phone was about to ring? It became less common over time because people started filtering those frequencies out explicitly on the “receiving” side as cell phones became more popular, but that’s an example of RFI that many people are familiar with.

We have a big library of hardware here–I’ve run into more than a few DACs that pick up similar interference from normal household stuff. One that is sensitive to a displayport monitor when it’s doing heavy screen updates…another that is sensitive to WiFi transmissions nearby, and so on.

This is without direct connection–and easy to hear by playing silence through the device, turning the volume up, plugging in headphones, and creating the interference.


LOL :joy: yes, the last time was about 15 years ago. My DACs are dead silent with no signal, even if I crank the volume.

It’s a statement about design philosophy and the things we value during the design process. Other players specifically try to be as light as possible. We do not treat that as a primary design goal.

We do not treat lightweight operation as a priority when designing the Core. Our library management is heavy. We keep large search indices in RAM. We do background processing–syncing your TIDAL library, updating metadata, audio analysis.

Some lightweight/audiophile players explicitly avoid this kind of thing, because they want to create a quiet, resource-light environment for playback.

We take another approach–provide lightweight endpoint implementations (Roon Ready, Roon Bridge) and let the Core do what it needs to do to produce the best UX.


All of the hardware that partners send us passes through my hands. You would be surprised how many modern, well-regarded DACs are not silent under these conditions. It’s definitely not the majority, but more than just a couple…
