Separating Core and player—why would you?

It’s going straight into a Bang & Olufsen system. It accepts HDMI and S/P-DIF over coax.

Obviously, I can get something like an Auralic streamer later on. It just makes so much sense to put the Mac by the TV to replace the Raspberry Pi 2 media center, which works great, but offers nothing the Mac can’t. Two birds with one stone.

I’d guess that Gabriel is going into an AV system tied to his TV. As to why jitter matters, or does not matter, take a read of

Personally, I’d use coax S/PDIF over HDMI, but that is just my taste.

A good DAC should be immune to jitter, by reclocking. Some of them are not.

Another concern that some people raise is that a direct connection through USB propagates not only the digital signal but electrical noise. The inside of the computer is full of activity, a very hostile place for audio. Again, a quality DAC should provide good isolation, but not all of them do. And to some extent, the electrical noise propagates as RFI, simply through proximity.

By separating them, you reduce RFI, and in principle a network connection can provide better noise isolation than USB. Again, it depends on the quality of the end point.

Wrt jitter, the network is completely asynchronous, the signal must be clocked at the receiving end, so it cannot introduce jitter. A modern USB connection is asynchronous too. But older ones are not.

When Roon 1.2 is released it is likely to include a Raspberry Pi2 image for RoonBridge. Then you could compare an Ethernet RAAT connection to the Pi with the Mini placed some distance away, with a direct connection to the Mini.

I’m not sure how HDMI audio works. There was an interesting discussion about different audio protocols here.

Cool, that would be really nice. I’d probably repurpose the RPi2 as a DAC for another room, so I’d still have use for it as a RoonBridge.

Too bad I missed the renaming event of RoonSpeakers. I was leaning towards Roon Jeremy.

If you are interested in a DAC for the Pi, check out the IQ audio gear. Using the HAT connectors avoids the Ethernet/USB problems that the Pi can have. There is a discussion thread here.

I feed Roon from a laptop and Sooloos from an MD600 into an AV system, but I do it over the wired network via a Meridian MS600 with the analogue out into my AV amp. I also feed a stereo system via an MS200. Sounds brilliant.

Even if you don’t agree with the jitter/electrical noise issues, you must hear the CPU fans.

I personally can’t stand the fans or spinning disks making noise when listening (especially at low volumes).

By separating the core and the outputs, you allow yourself to build a pretty nice Core machine, without compromise, and then put it in the closet far away, while keeping your output in your listening room.

We write a bit about why you want a strong machine for the Core, rather than trying to go as quiet as possible, here: https://kb.roonlabs.com/Why_Core%3F (the summary is: a powerful Core machine lets us give you the best experience).

I use my Mac mini both as the centre of my hi-fi system and as a way of viewing Netflix / iPlayer etc. through my TV.

I have HDMI connected to my TV but Roon is configured to speak directly to my USB DAC.

As the TV is the Mac’s default audio output (HDMI), Netflix and iPlayer etc. continue to work perfectly every time.

Roon then happily sends its signal direct to the USB DAC without a single hitch.

Couldn’t be happier with the simplicity of this setup in practice.

However, if you’re using the HDMI output for music as well, rather than a DAC over asynchronous USB, then you open yourself up to jitter issues, just as you would over S/PDIF.

Note: as Roon uses less than 5% of my CPU even with four audio zones running, my Mac’s fans never kick in and the drive never spins up. It is essentially audibly silent, both mechanically and, as far as I’m concerned, electronically, as there’s absolutely no identifiable noise through my system that isn’t music.

The mechanical noise is a really good point that I hadn’t thought of. I agree the Mac mini’s fans are very quiet, so I’m not sure it would be an issue. It’s not your typical PC. But we’ll see.

As for storage, I have that sorted in a closet elsewhere, so the Mac mini would only have the system, some apps and the database on an SSD, so no disk noises.

I’m going to read up on jitter as soon as possible.

Gabriel - my understanding of this issue is rather simplistic. I’m sure you will find much more robust explanations online. But at the risk of sounding pedantic, please allow me to weigh in…

Electrical signals to our DACs are intended to pass zeros and ones, as you say. But they are not sending zeros and ones, just analog signals that can be interpreted as zeros and ones by the DAC. And that interpretation depends on precise timing in the interpreting device (the DAC). And that DOES get messed up at times, resulting in jitter.

In something like the TCP/IP traffic that we’ve all come to know and love, information travels in discrete packets (or bundles). And those bundles have checksums, so the recipient can do a little math on the packet received and see if the result agrees with the checksum. If they don’t agree, it can send a “better send that packet again, something happened to that last one” message back to the sender. So it is zeros and ones that got encoded into an analog message, and ultimately decoded by the recipient as the exact same set of zeros and ones. This is how Roon would communicate with a network endpoint. But that has nothing to do with the music streaming into the digital input of your DAC. There are generally no checksums in that process.
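To make the checksum-and-retransmit idea concrete, here’s a toy sketch in Python (illustrative only: real TCP uses a ones’-complement checksum plus sequence numbers and ACKs, not CRC32, and these function names are made up):

```python
import zlib

def make_packet(payload: bytes) -> dict:
    # Sender attaches a checksum computed over the payload.
    return {"payload": payload, "checksum": zlib.crc32(payload)}

def receive(packet: dict) -> bytes:
    # Receiver redoes the math; a mismatch means "better send that
    # packet again, something happened to that last one".
    if zlib.crc32(packet["payload"]) != packet["checksum"]:
        raise ValueError("corrupt packet - request retransmission")
    return packet["payload"]

pkt = make_packet(b"audio frame 0001")
assert receive(pkt) == b"audio frame 0001"   # intact packet passes

pkt["payload"] = b"audio frame 0002"         # simulate corruption in flight
try:
    receive(pkt)
except ValueError:
    print("retransmit requested")            # prints "retransmit requested"
```

The key property is exactly the one described above: either the receiver ends up with the exact same zeros and ones, or it knows it didn’t and asks again.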

If something gets messed up in that process, it stays messed up, unless the DAC figures out that it looks wrong and approximates what it thinks it should have been. So in the case of digital music coming into your DAC (which is rendered in analog signals merely representing zeros and ones), timing becomes a critical issue. That analog signal is a waveform: the troughs are zeros, the peaks are ones. But the recipient reads the signal based on timing (like a fast metronome), and the “ticks” of that timing may not exactly line up with the peaks and troughs. So if the recipient ticks a little early, it may see the wave as it was moving toward a peak or a trough but not quite there yet. That leaves the recipient with a problem: that beat was not quite at peak level, nor at the lowest. Was it a zero or a one? Sometimes the resulting decision by the recipient is wrong.

And if the beats of the timing (the time between ticks) for the sender and receiver are not exactly the same (one slightly longer, one slightly shorter), then the chance of misinterpreting a zero or a one increases over time. But if you are using an asynchronous connection (like asynchronous USB), the receiver can detect the problem while it’s still easy to correct, and has the chance to say, “I think you are going a little fast (or slow). Please alter your timing rate to match mine.” That - of course - reduces jitter.
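The mechanism can be shown with a toy simulation (grossly exaggerated for illustration: real link jitter is a tiny fraction of a bit period, so actual bit flips are far less likely than this model suggests):

```python
def sample_bits(bits, timing_offset):
    """Toy receiver: each bit occupies one unit of time, and the receiver
    aims to sample at the centre of each bit period. A timing offset
    (exaggerated 'jitter') can push the sample into the neighbouring bit,
    so the wrong value is read."""
    read = []
    for i in range(len(bits)):
        t = i + 0.5 + timing_offset        # intended sample point: bit centre
        idx = min(int(t), len(bits) - 1)   # bit the sample actually lands in
        read.append(bits[idx])
    return read

bits = [0, 1, 0, 0, 1, 1, 0, 1]
assert sample_bits(bits, 0.0) == bits      # accurate clock: every bit correct
assert sample_bits(bits, 0.6) != bits      # clock off by over half a bit: misreads
```

The point of the model is only the shape of the failure: a receiver that ticks at the wrong moment reads a value that belongs to the wrong bit period.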

Bottom line - it’s NOT just zeros and ones. It’s a waveform that was encoded from zeros and ones, to be decoded back to the same zeros and ones if - and only if - the timing of said encoding and decoding precisely align.

Hope that helps. Sorry for the long - if overly simplified - explanation. And for all those tech cognoscenti here… if I misrepresented anything, please do correct me.

What are some examples, of what the effects of jitter may sound like ? I honestly wouldn’t know if I’m hearing it in my setup or not, as I have nothing to compare it to :slight_smile:

Decays on cymbals (sounding artificially sibilant or short) and lack of imaging are the main places I hear (or think I hear) the effects of jitter.

I think this is not quite right.

It is certainly correct that the digital signal is transmitted as an analog electrical signal, and that it can have jitter (timing inaccuracies) because of noise or distortion in that signal. But while it is conceivable that jitter causes bit errors in the transmission, that is extremely unlikely (because if it did, computer stuff wouldn’t work). Whether you have error correction or not (and I don’t know the details of USB protocols), these links work accurately for all practical purposes, including audio. Remember, if I transfer Windows or Office or Roon to a computer from a USB drive and there are bit errors, the software very likely won’t work. (Wikipedia does not even mention bit errors in its discussion of jitter.)

No, jitter is problematic in audio for reasons other than errors. And the jitter magnitudes we are talking about are very, very small: people measure them in picoseconds. When we are transferring 192/24 in stereo, that’s about 9 Mbps, so the bit time is about a tenth of a microsecond, and a picosecond is a millionth of a microsecond!
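The arithmetic, spelled out (assuming plain stereo PCM with no framing overhead):

```python
# Back-of-envelope figures for 192 kHz / 24-bit stereo PCM.
sample_rate = 192_000                         # samples per second, per channel
bit_depth = 24
channels = 2

bitrate = sample_rate * bit_depth * channels  # 9_216_000 bits/s, ~9.2 Mbps
bit_time_ns = 1e9 / bitrate                   # ~108.5 ns per bit, ~0.1 microseconds

# Jitter is measured in picoseconds: 1 ps = 0.001 ns, so even 100 ps of
# jitter is roughly a thousandth of one bit period.
print(f"{bitrate / 1e6:.3f} Mbps, {bit_time_ns:.1f} ns per bit")
```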

Even without any bit errors, jitter will introduce distortion into the analog signal that the DAC creates. One way to avoid this, which the industry stupidly adopted in the 80s, is to try to generate accurate timing at the origin (with a spinning disc platter, a mechanical device!) and then try to preserve it through the signal chain; this is what SPDIF actually requires, and USB audio initially adopted it too. A much better way is to just give up, consider the transfer to be asynchronous, get the signal into a buffer, and read it out of the buffer on an accurate local clock. That’s what asynchronous USB does, and network transmission does it too, because IP is inherently asynchronous: packets are not even guaranteed to arrive in the right order, and have to be reassembled into a signal. (This approach does introduce other timing issues for broadcast, as @Brian has discussed in these pages.)
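The “buffer it and reclock it” approach can be sketched like so (a minimal sketch with made-up names; a real endpoint would also reorder packets by sequence number and manage buffer depth to avoid underruns):

```python
from collections import deque

class AsyncEndpoint:
    """Receiver that ignores link timing entirely: packets land in a
    buffer whenever the network delivers them, and samples are read
    out at a steady rate set only by the endpoint's own clock."""

    def __init__(self):
        self.buffer = deque()

    def on_packet(self, samples):
        # Network side: arrival timing is irrelevant; just queue the data.
        self.buffer.append(samples)

    def clock_tick(self):
        # DAC side: driven by the local, accurate sample clock.
        if self.buffer:
            return self.buffer.popleft()
        return None  # underrun: the buffer ran dry

endpoint = AsyncEndpoint()
endpoint.on_packet([0.1, 0.2])   # packets may arrive in irregular bursts...
endpoint.on_packet([0.3, 0.4])
endpoint.clock_tick()            # ...but playout timing comes from the local clock
```

Because only `clock_tick()` determines when samples reach the DAC, any timing irregularity on the incoming link disappears, which is exactly why the quality of the endpoint’s own clock is what matters.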

That’s the whole discussion of “well-designed DAC”: if you do things right, jitter in the transfer should be a non-issue. With an old-fashioned DAC (“old” might mean last year :wink:) jitter is indeed a problem.

But the common discussions today about cables and computer proximity is more about noise from the originating computer or picked up by the cable getting into the audio electronics and polluting things there.

I don’t know how big an issue that is in practice. I have a Meridian 818, and I have run it from the Roon NUC with both network and USB and never noticed any problems. On the other hand, the Meridian is a very expensive piece of equipment and very well designed. On yet another hand, maybe my 60-year old ears are only 16 bit…

Anders

I think the point that Steve was trying to make was that the computer protocols used for moving data over a network, or from hard drive to hard drive, have built-in error correction that resolves imperfect data packets. And those imperfect packets don’t have the same time limitations for getting from A to B as a bit stream does. (He says hopefully!)

Cheers
Tom

I’m NOT going to claim expertise here. But I will state my understanding…

USB streaming audio is not consistent with your statements above, Anders. As I understand it, the USB streaming AUDIO standards do not have the error correction I mentioned in my earlier post - the packet-based error correction that is built into the USB DATA protocols. So timing (short of the feedback that asynchronous USB allows) becomes critical to correctly decoding the analog USB signal back into a perfectly reproduced binary stream. Whether or not a well-designed DAC has means of working past this is not the point. The point is that it’s not all zeros and ones travelling to our DACs. It is NOT guaranteed bit-for-bit reproduction the way a normal TCP/IP connection would allow - or, for that matter, a USB DATA connection (data being the operative word in your Windows/Roon software-transfer example). This is USB AUDIO streaming, which - outside of the important timing feedback that an async USB audio connection allows - is highly dependent on the timing of the signal, except where somewhat compensated for by high-quality DACs.

Thanks Tom. But more to the point… USB audio does not have said packet-by-packet (“Send me a new one, that last one was bad!”) error correction built in. Instead it relies on timing. Fortunately there is a feedback loop for async audio timing (“Speed up/slow down, please!”). With S/PDIF, which has no such feedback process, timing becomes even more critical.

Are you sure of that?

Of course in networking we have UDP, as an unreliable alternative to reliable TCP, for reasons of cost and performance. And of course reading an audio CD is unreliable, because 1982. But I have never heard of an unreliable option in USB, and Wikipedia doesn’t seem to mention one.

But in any case, my points were that the issue of separation is more about noise than errors, and jitter is a distortion problem unrelated to bit errors.

Anders - absolutely not! LOL.

But I’m pretty sure. And I suspect you are reading the USB data standards rather than the audio standards. But that is not definitive, is it? So if someone INTIMATELY knowledgeable about USB streaming audio wants to chime in, please do.

Thanks.

UDP is subject to packet ordering (or loss, or duplication) issues, but not corruption. It still has a checksum that can be verified to detect corruption of bytes – slightly different from S/PDIF’s transmission, where actual bits can be read incorrectly with no way to tell.

USB has a similar thing, where Isochronous and Bulk transfers both are CRC checked. Bulk transfers will also do the TCP-like guarantee of delivery. Audio goes over Isochronous transfer mode.
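For the curious, the CRC-16 that USB applies to data fields can be written out in a few lines of Python (a software illustration; real USB hardware computes this with a shift register as the bits go by):

```python
def crc16_usb(data: bytes) -> int:
    # CRC-16/USB: polynomial 0x8005 (0xA001 when bit-reflected),
    # initial value 0xFFFF, final XOR 0xFFFF.
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc ^ 0xFFFF

payload = b"123456789"
assert crc16_usb(payload) == 0xB4C8        # published check value for this CRC
assert crc16_usb(b"123456788") != 0xB4C8   # a corrupted byte changes the CRC
```

So the receiver can always tell that a packet was damaged; what differs between transfer modes is only whether a damaged packet gets re-sent (bulk) or simply dropped (isochronous).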

Danny - thanks for chiming in. Apologies to the OP as this continues a bit OT, but to make sure I’m understanding…

USB audio has no delivery guarantees? But it does have a checksum to ensure that the bits that WERE received were the correct bits? Is that correct?

For if so, that would seem to imply that timing issues with DACs are not at a bit level - as I thought I understood - but at a larger, multi-bit level.

Sorry you’re having to educate the novice. But ’tis a wonderful learning experience.