RoonReady: who controls the sound?

When Chris over at CA raved about the Auralic Aries driving his Berkeley DAC, he mentioned how he heard new things in his music. First off, this discussion assumes that a digital music player has a “sound”, and that the Aries would sound different from another streamer, or from a desktop or laptop connected to the same DAC.

That then raises the question: when using a RoonReady device (like the Aries), which player of the bits (Roon or the Aries) determines the “sound”? If it’s Roon, that means one shouldn’t bother (over-)spending on the streamer, as all RoonReady endpoints will sound the same. Maybe this boils down to: who controls the conversion from bits in a file to PCM or DSD?

Does any of this make sense? I feel like I don’t understand enough of this to even ask a valid question.

I think anyone who can answer properly will understand your question, so don’t worry - it makes sense.

I’ll summarise the two answers for you before they all come in:

A. All bit-perfect players/computers sound the same - there is no ‘sound’ per se - it’s just delivery of digital data to the DAC. Sound differences are imagined.

B. It’s really complicated and players and protocols may or may not influence sound depending on your system, ears, beliefs, etc - for a few reasons and not all well understood or simple to explain.

Enjoy :wink:

6 Likes

:stuck_out_tongue:

Hi Mike,

This KB page on RAAT and this thread on clock ownership may be of assistance in understanding how Roon Ready devices work with Roon in comparison to other systems and protocols.

There are other opinions about this. Different bit-perfect players or computers can use more or less intensive processing, leading to noise propagating via ground connections. Ethernet with appropriate ground isolation (microRendu), optical, or Wi-Fi (Aries) can avoid any galvanic connection which carries such noise.

1 Like

So from the linked post on the clocking thread (a few replies further down),

the assertion is that the clock in the streamer doesn’t matter when using RAAT (the only clock that matters is the DAC), therefore there’s no reason to buy the more expensive Aries with the femto-clocks for this usage. Is that a valid extrapolation from @brian’s comment?

1 Like

Yes, I believe so. With a RAAT connection [to USB DAC, see Brian’s post below] only the clock in the DAC matters. The Aries has other virtues including an excellent Wi-Fi implementation and a low noise environment, but, imo, it’s not good value for money if you are looking at using it only as a Roon endpoint.

It depends on how you’re using it.

If a streamer is speaking to the DAC using USB, then yes, the DAC’s clock is running the show and the femto-clock in the Aries isn’t involved.

If the streamer is speaking to the DAC using S/PDIF, AES, or another similar source-clocked interconnect, then the clock in the Aries is being used to generate that S/PDIF signal. In this sense, clock quality could impact what’s going on downstream at the DAC, depending on how hard the DAC’s input stage works on the problem of “cleaning up” the incoming signal.

I would not spend extra $$ on a “femto” clock from any manufacturer without doing an ABX test with the DAC that I planned to use it with. Over USB, it adds nothing, and over S/PDIF, there’s a better than even chance that the DAC is using asynchronous resampling or re-clocking in its input stage–which greatly reduces the impact of the clock in the streamer, even over S/PDIF.

2 Likes

That was my alternate answer - B? :slight_smile:

1 Like

Ah, I see. Yes, the two schools of thought are as described.

I bought a femto Aries about one month ago and, based on my imagination, found the power supply can make a dramatic difference (I’m powering mine from a JS-2 LPS, together with the Mac mini running Roon Server)

maybe you might get an LE and use the extra money on a very good LPS
maybe :wink:

This is the case with every use of these interfaces, nothing special about Roon here, correct? That is, my understanding is that the USB receiver chip will “reclock” the signal whereas the SPDIF signal will be the driver of the DAC clock. I might be incorrect of course, and there are surely some implementations where all signals get reclocked. I would appreciate if someone can authoritatively clarify this.

1 Like

There’s a lot of subtle incorrectness in your paragraph…let me clarify.

This is the case with every use of these interfaces, nothing special about Roon here, correct?

You are correct that Roon does not change how USB or S/PDIF works. No, it is not the case that every use of these interfaces works the same way.

Remember, we make software and protocols, not hardware, so the relevant comparisons here are against other software and protocols.

Some streaming protocols, including AirPlay and Songcast, use the system clock on the media server to determine the rate that the stream is transmitted to the playback device.

RAAT exposes the device’s clock over the network so that Roon can slave to it. In the case of a USB device, it’s exposing the clock embedded within the USB device. In the case of a S/PDIF transmitter, it is exposing the clock in the transmitter. In the case of a Roon Ready device that doesn’t use USB or S/PDIF internally, it’s exposing the DAC’s internal clock.

Whenever you have a clock generating content at one rate, and a second clock receiving it at a different rate, something must be done to reconcile the discrepancy. Two clocks will always keep time at a slightly different rate. In most cases, the rate changes meaningfully based on ambient temperature.

Resolving that discrepancy involves engineering tradeoffs. Expensive systems use PLL to “bend” the receiving clock to match the sender. Less expensive systems, including most consumer-grade stuff, perform lossy signal processing (either re-sampling or “stuffing” and “dropping”) to reconcile the differences.
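To make the “stuffing and dropping” idea concrete, here’s a toy Python sketch (purely illustrative - the name `compensate_drift` is made up, and this is not Roon’s or any vendor’s code) of a receiver absorbing a sender/receiver rate mismatch by occasionally dropping or duplicating a sample:

```python
def compensate_drift(samples, drift_ppm, rate=44_100):
    """Toy drift compensation: drop (if the sender is fast) or stuff
    (if the sender is slow) one sample every N samples to absorb a
    rate mismatch of drift_ppm parts per million. Lossy by design."""
    if drift_ppm == 0:
        return list(samples)
    # One correction per this many samples covers the ppm mismatch.
    interval = int(1_000_000 / abs(drift_ppm))
    out = []
    for i, s in enumerate(samples):
        out.append(s)
        if (i + 1) % interval == 0:
            if drift_ppm > 0:
                out.pop()        # sender fast: discard a sample
            else:
                out.append(s)    # sender slow: duplicate a sample
    return out
```

At a 100 ppm mismatch and 44.1 kHz, this touches a sample several times per second - which is exactly why the approach is described as lossy, in contrast to bending the receiving clock with a PLL.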

So, when you compare RAAT (a network protocol) to AirPlay (another network protocol), both sitting on top of the USB device, the AirPlay configuration has two clocks and a lossy clock drift compensation mechanism somewhere in the system, and RAAT is using a single clock and doing bit-perfect playback.

That is, my understanding is that the USB receiver chip will “reclock” the signal

The word re-clocking implies that there are multiple clocks. There’s no necessary **re-**clocking with async USB devices. The USB device contains a clock, and requests data from the computer based on that clock’s rate.

whereas the SPDIF signal will be the driver of the DAC clock.

This is only true of the cheapest DACs. Almost everything used in a HiFi/Audiophile context is designed to perform some form of re-clocking or asynchronous resampling of S/PDIF input.
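For a rough picture of what “asynchronous resampling” of S/PDIF input means, here’s a hypothetical sketch (`async_resample` is an invented name; real ASRC chips use long polyphase filters, not this linear toy): the DAC re-grids the incoming stream onto its own clock, so the transmitter’s clock no longer times the conversion.

```python
def async_resample(samples, in_rate, out_rate):
    """Toy asynchronous sample-rate converter: re-grid `samples`
    (nominally at in_rate) onto the receiver's own out_rate clock
    using linear interpolation between adjacent input samples."""
    n_out = int(len(samples) * out_rate / in_rate)
    out = []
    for i in range(n_out):
        pos = i * in_rate / out_rate          # position on the input grid
        j = int(pos)
        frac = pos - j
        a = samples[j]
        b = samples[min(j + 1, len(samples) - 1)]
        out.append(a + (b - a) * frac)        # interpolate between neighbors
    return out
```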

3 Likes

Ok, so correct me if I’m wrong here… When RoonReady runs over the network, the device’s clock is exposed so that you can sync multiple streams. For this sync, the added reliability of a femto clock is irrelevant (USB or SPDIF). Now I am no USB expert, but my understanding is that the transmitter chip requires a clock to create the “squarish” signal, while the receiver chip will detect the squarish signal and “reclock it” (or make it into binary numbers with the receiver chip’s clock). In this case the only use of the femto clock on the transmitter side would be to reduce jitter, which ought to make the receiver chip’s job less hard… Am I correct here?

Ok, so correct me if I’m wrong here… When RoonReady runs over the network, the device’s clock is exposed so that you can sync multiple streams.

Yes.

For this sync, the added reliability of a femto clock is irrelevant (USB or SPDIF).

Yes. Femto-clocks are more accurate with respect to jitter, but not necessarily in terms of clock rate.
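To put rough numbers on that rate-versus-jitter distinction (illustrative figures, not measurements of any particular clock): an absolute rate error of a few tens of ppm is unremarkable for a crystal oscillator, and it’s that rate error - not jitter - that determines how fast two free-running clocks drift apart.

```python
def drift_samples(rate_hz, offset_ppm, seconds):
    """Samples by which two free-running clocks diverge after
    `seconds`, given an absolute rate offset in parts per million."""
    return rate_hz * seconds * offset_ppm / 1_000_000

# A 50 ppm offset at 44.1 kHz: 7938 samples (~180 ms) of drift per hour.
per_hour = drift_samples(44_100, 50, 3600)
```

A femtosecond-jitter clock can still have a rate offset like this, which is why the femto spec doesn’t help with multi-stream sync.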

Now I am no USB expert, but my understanding is that the transmitter chip requires a clock to create the “squarish” signal, while the receiver chip will detect the squarish signal and “reclock it” (or make it into binary numbers with the receiver chip’s clock).

I might not completely understand which “Transmitter” and “Receiver” you’re talking about. I’m assuming “Transmitter” is “computer” and “Receiver” describes the USB receiver module within a USB DAC (with DAC defined as “the whole product” not “the DAC chip”).

In USB Audio streaming, the USB device says “I need another block of audio data soon” and the computer gets it together and sends it. The block contains many discrete samples–probably 10s of milliseconds of audio. The USB device has a large enough buffer internally that it can tolerate some time passing in satisfying those requests for blocks of data, so the exact timing of that message and its response do not really matter.

The DAC chip is consuming samples from the USB receiver at its own rate (in reality, a common clock is likely driving both in some form of lock-step).

When the buffer in the USB receiver drops to a certain level, that triggers another “I need another block of data” message, which causes the computer to send more data, which refills the buffer.

So the USB receiver is clocking out the data, but the data was not previously transmitted according to another clock, so there is no re-clocking going on.

Another way of saying it: The USB DAC is pulling blocks of data from the computer exactly as Roon might pull blocks of data from a file on disk: asynchronously, and on demand. We don’t say that Roon is “re-clocking” the buffers when it takes them from the disk and re-transmits them over USB.
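The pull model described above can be sketched as a toy simulation (all names here - `ToyUsbDevice`, `fetch` - are invented for illustration; nothing below is Roon- or USB-class-specific): the device drains its own buffer at its own rate, and only asks the host for another block when the buffer dips below a low-water mark.

```python
from collections import deque

class ToyUsbDevice:
    """Toy model of an async USB audio receiver: drains its buffer at
    the DAC's own rate, pulling a new block from the host whenever the
    buffer falls below a low-water mark."""
    def __init__(self, host_fetch, block=4096, low_water=2048):
        self.buffer = deque()
        self.host_fetch = host_fetch  # callable: returns next block from host
        self.block = block
        self.low_water = low_water
        self.requests = 0

    def consume(self, n):
        """The DAC clocks out n samples; the device refills on demand."""
        out = []
        for _ in range(n):
            if len(self.buffer) < self.low_water:
                self.buffer.extend(self.host_fetch(self.block))
                self.requests += 1
            out.append(self.buffer.popleft())
        return out

# Host side: simply hands over the next block of the stream on request.
stream = iter(range(1_000_000))
def fetch(n):
    return [next(stream) for _ in range(n)]

dev = ToyUsbDevice(fetch)
audio = dev.consume(10_000)
```

The point is the direction of control: the device asks and the host answers, so no second clock paces the transfer and there is nothing to “re-clock”.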

this case the only use of the femto clock on the transmitter side would be to reduce jitter which ought to make the receiver chip’s job less hard… Am I correct here?

Yes.

Bringing this back to the Aries, which was discussed earlier in this thread: the femto clock cannot have an impact in USB mode, since when the Aries is running in USB mode with RAAT, the clock driving the system is the one in the USB device, not the one in the Aries. The femto clock only serves to decrease jitter on the Aries’ S/PDIF and AES outputs, since in that case the Aries is actually involved in clocking out data directly.

5 Likes

Thank you Brian. Yes, by transmitter and receiver I meant computer and USB module in the DAC.

My understanding is that devices like the Regen work because the receiver chip needs to do more work recreating a digital signal out of a “squarish and jittery” signal than out of one that is less distorted (faster transitions between the 0 and 1 voltage levels because of better impedance matching, for example) and less jittery (not sure why this matters, but jitter and a slow rise/fall are very close to the same thing). This lesser amount of work translates into less ground-plane noise at the receiver end. So the thinking is that, if the Aries or Regen femto clocks are able to give you very little jitter and a very high slew rate, then the receiver chip has a lot easier a job to do and less noise is introduced.

I understand the async USB mechanism of packets etc, but the above might still matter.

But like I said I am not an expert and all I have observed is that the Regen and W4S RUR do make my system sound better over USB.

[and I understand this is not a thread about the Regen or RUR, but this is the argument behind the Femto clocks in the Aries, at least one of them.]

1 Like