Roon Core on Laptop; actual purpose of ethernet?

Don’t keep such an open mind that your brain falls out. Which OS you’re using is not going to affect which bits are sent out to your DAC. This is not HiFi – it’s computer data processing, and all the standard rules of that apply.

3 Likes

Going back to the OP…

The purpose of wiring your Core via an Ethernet home run directly to your router or primary switch is to reduce latency in the most latency-sensitive payload you are likely to have in your house. Nothing else you do will come close. Streaming music directly from Tidal or Qobuz, or serving an unprocessed file straight from a NAS, is nowhere near the same challenge. If Roon sold a “no processing, no multi-room sync” version so it was just a fancy remote, the complexity and sensitivity would be lower… but then Roon would not be Roon (whether you want those features or not).

Edit: and I should say that all this will do is reduce dropouts, delays and gaps. I personally do not believe it could in any world make a difference to SQ, unless you believe that dropouts are bad for SQ, in which case, count me in.

@Bill_Janssen it is HI-FI, otherwise most of us wouldn’t be here :wink:

I think your reply, and one or two others, are missing the point. It’s not just about which OS is being used; it’s everything else involved in the machine running that OS.

An open mind is definitely the right way to be; we’d all be ignorant, or still listening to vinyl, if it weren’t…

Umm, I am still listening to vinyl a big portion of the time, nearly half. And I have a very open mind in the analog portion of the signal chain, and a fairly closed mind in the digital portion. Different strokes, I suppose.

One thing to note: it’s important to understand the difference between latency and jitter. One could argue that if your stream has latency A over Wi-Fi and latency B over Ethernet, and A and B are constant, then you shouldn’t get dropouts either way, right? Because the buffer in your device should stay full once that initial latency has been absorbed.

The reason is that jitter (the variation in latency, i.e. in the network delay) is never zero. So the packets still arrive in order, but the buffer fills quickly at some moments and runs down at others. (Same as traffic on a highway: think of all those phantom traffic jams.)
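To make that concrete, here’s a toy Python simulation. All the numbers are invented and nothing here is Roon-specific: two links with the same average latency, one steady and one that occasionally stalls for a few hundred milliseconds the way Wi-Fi can during retransmissions or interference bursts.

```python
# Toy playback-buffer simulation: same average latency, different jitter.
# Every number below is made up for illustration; real streamers use much
# larger buffers and adaptive refill strategies.
import itertools
import random

def underruns(spike_prob, spike_ms, packets=5000, packet_ms=10.0,
              base_latency_ms=30.0, jitter_ms=5.0, prebuffer_ms=100.0,
              seed=1):
    """Count how often a fixed-rate player finds its buffer empty.

    Each packet carries packet_ms of audio and is sent every packet_ms.
    Its one-way delay is base_latency_ms plus a little random jitter,
    plus an occasional spike_ms stall with probability spike_prob.
    """
    rng = random.Random(seed)
    raw = [
        i * packet_ms
        + base_latency_ms
        + rng.uniform(-jitter_ms, jitter_ms)
        + (spike_ms if rng.random() < spike_prob else 0.0)
        for i in range(packets)
    ]
    # TCP delivers in order, so a stalled packet also holds back every
    # packet behind it (modelled with a running maximum).
    arrivals = list(itertools.accumulate(raw, max))

    buffered = 0.0     # milliseconds of audio sitting in the playback buffer
    started = False
    dropouts = 0
    prev = 0.0
    for t in arrivals:
        if started:
            buffered -= t - prev      # audio played out since the previous packet
            if buffered < 0:
                dropouts += 1         # buffer ran dry: audible gap / rebuffer
                buffered = 0.0
        buffered += packet_ms
        if not started and buffered >= prebuffer_ms:
            started = True            # prebuffer full, playback begins
        prev = t
    return dropouts

print("steady link:", underruns(spike_prob=0.0, spike_ms=0.0))
print("spiky link :", underruns(spike_prob=0.002, spike_ms=400.0))
```

With these made-up numbers the steady link never underruns, because a constant (or mildly varying) delay is fully absorbed by the prebuffer, while the spiky link does underrun even though its average latency and throughput are far more than the stream needs.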

People often talk about latency when what they are really describing is jitter. A common question is: why am I having problems with my broadband even though I have 500 Mbps download? Usually it’s because of jitter. (I experienced this with Google Fiber while living in Silicon Valley, half a mile from Google HQ. I had trouble streaming a 128 kbps stream on a gigabit connection!)

That’s why Ethernet is almost always more reliable than Wi-Fi: the jitter is much more likely to stay low on a wired connection.

And jitter means your sample points aren’t where they should be.

1 Like

There is no jitter over TCP/IP. Either you hear perfect music or you hear dropouts; that’s about it. The streamer and DAC receive data(!), never music, and buffer that data as needed to feed the DAC perfectly. The DAC uses its own clock to reconstruct the analog audio signal from the buffered packets. If the data connection (be it Ethernet or Wi-Fi) ever fails to deliver in a timely manner, you don’t suffer from jitter: the music simply stops playing, either as dropouts or completely. It’s ASYNCHRONOUS transmission as long as it’s in the data domain.

Edit: adding a sketch of how this works with (buffered) devices most of us can relate to :wink:
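To put the asynchronous / buffered model into code form, here is a toy Python sketch (invented numbers, not how Roon or any particular streamer or DAC is actually implemented): the network drops packets of samples into a FIFO whenever they happen to arrive, while the “DAC” pulls exactly one sample every 1/44100 s on its own clock.

```python
# Minimal sketch of asynchronous playback: the network fills a FIFO at
# whatever irregular pace it manages, while the DAC drains it at exactly
# one sample per 1/FS seconds using its own clock. Network timing never
# reaches the output side; the only audible failure mode is an empty FIFO.
from collections import deque
import random

FS = 44_100                 # DAC sample rate (samples per second)
PACKET = 4_410              # samples per network packet, i.e. 100 ms of audio
rng = random.Random(0)

fifo = deque([0] * (3 * PACKET))   # prebuffer roughly 300 ms before starting
next_arrival = 0.0                 # when the next packet shows up (seconds)
dropouts = 0

for n in range(FS * 10):           # play 10 seconds of audio
    now = n / FS                   # sample n leaves the DAC at exactly n / FS

    # Accept whatever the network has delivered by now. Arrivals are bursty
    # and irregular, and on average run slightly ahead of real time, the way
    # a streaming source typically fills its receive buffer ahead of playback.
    while next_arrival <= now:
        fifo.extend([0] * PACKET)              # 100 ms more audio (zeros as stand-ins)
        next_arrival += rng.uniform(0.02, 0.16)

    if fifo:
        fifo.popleft()             # output timing comes from the DAC clock alone
    else:
        dropouts += 1              # buffer ran dry: you hear a gap, not "jitter"

print("dropouts:", dropouts)
```

The point of the sketch is that arrival jitter only decides whether the FIFO ever empties; the instant at which sample n is converted is always n / FS on the DAC’s own clock, regardless of when its packet happened to arrive.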

9 Likes

Thanks everyone! Great insights here.

This topic was automatically closed 45 days after the last reply. New replies are no longer allowed.