In this enlightening thread about RAAT and Clock ownership, @Brian mentioned at least two buffers in Roon’s playback chain. It sounds like those buffers may reduce the impact of data packet retransmissions. Many believe that such retransmissions traversing a network have a significant negative impact on sound quality. It’s one of the ostensibly objective arguments for premium ($$$) Ethernet cables: that they reduce or eliminate said retransmissions.
What I’m wondering is whether Roon’s architecture negates such concerns - assuming a simple Roon architecture, with a Roon core, music on a local drive, connected to a Roon endpoint over a wired network, with cables that are demonstrably up to CAT5 spec (or better) - such that the potentially negative sonic impacts of retransmissions are substantially reduced, if not eliminated?
What is the point of -specifically- mentioning Ethernet cables if you don’t want to discuss Ethernet cables? “Objective arguments” are one thing, but has anyone produced “Objective evidence” showing that audiophile Ethernet cables measurably reduce or eliminate packet retransmissions that otherwise occur with in-spec non-audiophile cables that aren’t defective? If so, I would love to see a side-by-side comparison of those log files.
I’ll open by saying that I do sell a couple of different options in Ethernet cables, but won’t comment either way on their efficacy.
I’ve seen various discussions in various places online related to premium cables reducing network re-transmits and all I can do when I read them is chuckle. I used to burst out in laughter, but the repeated introduction of this argument doesn’t hold the entertainment value that it once did.
Roon’s buffers are in place to ensure that the DAC at the end of the chain always has the data that it needs in order to continue playback. There are plenty of things that can go wrong between point A and point B and having a healthy reserve of data next to the DAC ensures that a glitch in the network (or server / client) doesn’t impact the DAC’s ability to fetch the next sample.
Regardless of how a packet got lost or corrupted, a retransmit should be transparent to the application (Roon or otherwise), and there are various facilities built into the networking stack to ensure that data doesn’t get permanently lost in transit.
A retransmit can happen for various reasons, but in a typical network any number of retransmits suggests that something is fundamentally broken. If one is seeing retransmits due to overload of the networking hardware (too much traffic) then no cable in the world is going to make the situation any better. If one is seeing retransmits due to a bad physical connection then that’s just a bad cable and it should be tossed.
In other words, on a typical home Ethernet network (just ignore WiFi in this discussion) there should be NO retransmitted packets at all assuming everything is functioning correctly. The argument that a “better” cable is going to solve your audio problems due to a reduction of retransmitted packets signals a complete failure to understand how Ethernet works.
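For the curious, this is easy to verify rather than argue about: Linux keeps a cumulative count of retransmitted TCP segments in /proc/net/snmp. A small sketch to read it (Linux-specific; the path and field names are kernel conventions, nothing to do with Roon, and on a healthy wired LAN this counter should barely move during playback):

```python
# Read the kernel's cumulative TCP retransmission counter (Linux only).

def tcp_retrans_segs(path="/proc/net/snmp"):
    """Return the RetransSegs counter, or None if unavailable (non-Linux)."""
    try:
        with open(path) as f:
            lines = f.read().splitlines()
    except OSError:
        return None
    # /proc/net/snmp pairs a header line ("Tcp: RtoAlgorithm ... RetransSegs ...")
    # with a value line ("Tcp: 1 200 ..."); match the columns up by name.
    for header, values in zip(lines, lines[1:]):
        if header.startswith("Tcp:") and values.startswith("Tcp:"):
            fields = dict(zip(header.split()[1:], values.split()[1:]))
            return int(fields["RetransSegs"])
    return None

if __name__ == "__main__":
    print("TCP segments retransmitted since boot:", tcp_retrans_segs())
```

Sample the counter before and after an hour of playback; if the difference is essentially zero, no cable upgrade has anything left to fix.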
If the cable manufacturer is using this argument then someone needs to be fired because the other way of making that statement is, “We at Cable Company X know very little about how Ethernet works, now buy our expensive Ethernet cable!”
If an audio dealer is making the argument then the dealer is revealing his fundamental lack of knowledge of how networking works and is simply parroting what some manufacturer / forum / guru told him.
Again, I’m not making a pro or con argument regarding the ability of a premium cable to improve sound. All I’m saying is that any gains in performance have absolutely nothing to do with retransmitted packets!
Thank you. That is exactly what I’m seeking firm confirmation of.
Why? As you indicated, AMP, short of a fundamental flaw in your network, if the playback is bit-perfect - with no gaps in timing - then the argument for needing a quality cable to reduce/eliminate retransmissions goes away.
But let’s be clear… confirming bit-perfect delivery is not the same thing as all the bits arriving in time. Which is why knowing that the buffers are sufficient to absorb any delays from retransmissions should be good enough.
The problem here is that many people try to understand network data transmission using all of the buzzwords that they’ve encountered relating to digital audio. While the two are fundamentally similar and share the same vocabulary it’s not even remotely correct to approach them in the same way.
Networking is ALWAYS bit-perfect. Period. If data is mangled in transit then that’s identified at the receiving end with a checksum and a replacement packet is requested. This can (and does) happen and it’s transparent to the user. The cause can be anything from EMI/RFI to failing hardware. The data at the remote end of the connection must always match the data sent. It’s up to the networking stack and the application to ensure that this happens, otherwise the application and/or the network are fundamentally broken.
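To illustrate the detection mechanism: this is the generic Internet checksum from RFC 1071, as used in TCP and UDP headers. It’s shown purely as an example of how a receiver notices mangled data, not as Roon’s actual code:

```python
def internet_checksum(data: bytes) -> int:
    """Ones'-complement sum of 16-bit words, per RFC 1071 (used by TCP/UDP)."""
    if len(data) % 2:                  # pad odd-length data with a zero byte
        data += b"\x00"
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)   # fold carries back in
    return ~total & 0xFFFF

packet = b"some audio samples"
good = internet_checksum(packet)

# A single flipped bit in transit changes the checksum, so the receiver
# discards the segment and the sender retransmits it.
corrupted = bytes([packet[0] ^ 0x01]) + packet[1:]
assert internet_checksum(corrupted) != good
```

The point being: corruption is caught and repaired below the application. Roon never sees a damaged sample.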
This isn’t to say that the data always arrives at the receiving end in the same order that it was sent or that it all arrives in a timely manner. It usually does, but the networking stack is designed to address the situation when it doesn’t using a combination of buffers and retransmission of data.
Roon’s buffers have nothing to do with retransmission of data. That’s handled several layers down. Roon needs additional buffers because audio is a real-time process. If the next sample isn’t ready and waiting then playback stops. While the network can and will ensure that the data gets there (eventually), audio playback needs some extra insurance to make sure that there’s always something for the DAC to play.
For example, take a DXD file (24/384). The data rate at playback is 18.4 Mbit/sec (24 bits/sample/channel * 384000 samples/sec * 2 channels). Assuming a gigabit network connection you have a pipe that can move data at a rate of 1000 Mbit/sec, as well as hardware at both ends that can (hopefully) process data at that rate. In other words, throw a ridiculously huge audio data stream at your network and your network isn’t going to even remotely break a sweat.
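To make the arithmetic concrete (same numbers as above):

```python
bits_per_sample = 24
sample_rate = 384_000           # DXD
channels = 2

stream_mbit = bits_per_sample * sample_rate * channels / 1_000_000
link_mbit = 1000                # gigabit Ethernet

print(f"DXD stream: {stream_mbit:.1f} Mbit/s")             # 18.4 Mbit/s
print(f"Link utilisation: {stream_mbit / link_mbit:.1%}")  # 1.8%
```

Even the biggest PCM stream you can play occupies under two percent of a gigabit link.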
When playback starts, Roon allocates a buffer of some size on the playback client and then front-loads a buffer’s worth of data to the client. In the case above the DAC drains the buffer at 18.4 Mbit/sec and as it does so the Roon server sends more data to keep the buffer topped off. Should a networking glitch occur, the DAC has some number of samples (which corresponds to some amount of time) sitting in reserve to decode. As long as the glitch clears in time, the Roon server can push a big chunk of data to quickly refill the buffer and the user will never know (nor be able to hear it happening).
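That drain-and-top-off behaviour can be sketched as a toy simulation. All of the sizes and durations below are invented for illustration; Roon’s actual buffer depths aren’t public:

```python
# Toy model: the DAC drains a client-side buffer at a constant rate while
# the server refills it each tick; a mid-stream "glitch" pauses refills.
# One tick = one second of audio. All numbers are hypothetical.

BUFFER_SECONDS = 2.0

def simulate(glitch_seconds, playback_seconds=10, glitch_start=3):
    buffer = BUFFER_SECONDS              # front-loaded before playback
    underruns = 0
    for t in range(playback_seconds):
        buffer -= 1.0                    # DAC consumes 1 s of audio per tick
        if buffer < 0:
            underruns += 1               # audible dropout
            buffer = 0.0
        in_glitch = glitch_start <= t < glitch_start + glitch_seconds
        if not in_glitch:
            buffer = BUFFER_SECONDS      # server tops the buffer back off
    return underruns

print(simulate(glitch_seconds=1))   # short glitch: buffer absorbs it -> 0
print(simulate(glitch_seconds=4))   # glitch outlasts the buffer -> dropouts
```

Any glitch shorter than the buffer depth is inaudible; only an outage that outlasts the reserve ever reaches your ears.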
Roon’s buffers don’t eliminate retransmits, they provide a mechanism to deal with the situation so that playback is continuous.
Well… actually that is not true for real-time protocols, though very few people actually run into those. In real-time protocols bits get lost, and they don’t get replaced. ’Cuz it’s old news, so no need to resend. That’s not always the case for streaming though, because typically it’s not really real-time. It’s just continuous delivery at a prescribed pace.
Well I think that speaks to the meat of the matter that I’m looking to understand. The bottom line appears to be that the required data will be where it needs to be - in the time frame it needs to be. If so, one should never worry about premium cables - for the purpose of eliminating/reducing retransmissions - when the data you are trying to have appropriately delivered is Roon data.
As far as I know all of the various methods of networked audio playback use some sort of buffer on both ends of the connection in order to address network quirkiness. This is just good design and depending on the sample rates used that buffer doesn’t need to be large at all. Even Devialet’s Air protocol (which is one of the most unreliable ever conceived) does some buffering on the client end.
I would change your last statement as follows:
One should never worry about, nor justify, premium cables based on eliminating/reducing retransmissions.
Only if you’re looking for a project. The differences between cable types really only show up at high throughput (i.e. not audio) on long cable runs. Nothing you’re doing with Roon is near those limits.
Indeed. When we bought our home 20 years ago I spent a couple of days in the attic running a lot of Cat 5 (which is fun with a 1930s home). When we remodeled in '07 I retained what was there and ran a lot more, but used Cat 5e. Depending on the location in the house I have a mix of Cat 5 and 5e to different locations and without looking at the patch panel couldn’t tell you what ports are what. I’ve never found any difference in real-world performance between the two.
One thing that is worth noting is that shielded cables can create some potential problems. By design Ethernet is transformer coupled, which means that there is no electrical connection between the two sides of the connection. This eliminates the potential for ground loops between devices… a very good thing for audio. The problem is that if you use a shielded cable you are now creating a ground connection between the switch and your endpoint (and therefore the rest of your system). This creates a huge potential for bringing noise straight into your system, which is what we’re all trying to avoid in the first place.
This is especially problematic if you are using a switch that doesn’t have a 3-prong (grounded) power cord. Assuming that device has provisions for shield contact (many don’t, but some Apple devices do for sure) then your system is now the grounding point for that noisy device and potentially a grounding point for anything else connected to it!
I think the OP’s question has been answered, thank you. So hopefully he’ll forgive me going OT…
Now I AM learning. So what does that mean when virtually every SoHo router and switch on the market has a two-pronged plug - be it on a wall wart, or on the cord leading to an outboard transformer? If the two prongs are polarized (big blade and small blade), and your socket is up-to-date (polarized), and if your house is wired correctly, does that mean the issue you are alluding to is not an issue?
I don’t think polarisation saves the day. The problem is our old friend the earth loop. The potential for common mode noise on shielding is well known to network engineers. They don’t worry about it in a network context because low frequency interference on their ground poses no risk to data transmission. Connecting an audio system ground isn’t a concern to them, but it is to us because that noise can propagate into audio equipment. Best practice is to avoid shielding, cut it, or use a device where the shielding does not make contact with any internal ground.
Thanks Andybob. This is getting into an area where my knowledge would not fill up a thimble, but would - in fact - leave it mostly empty.
That said, I find this interesting because my understanding is that AudioQuest puts great stock in the shielding of their premium Ethernet cables. And while I CANNOT understand their marketing materials (over my head), I gather from some simple layman’s language sources that the amount of shielding increases with each more expensive product.
So… either I don’t have the knowledge to see that there is no real conflict here, or something’s not adding up between the thinking on this thread (shielding = bad) and AudioQuest’s marketing (shielding = good).
PS - you know me well enough Andybob, to know I’m not taking a poke at you. Just trying to understand.
No problem Steve, I know that, and I’m just citing stuff I find online rather than speaking from experience. But I would put more weight on the experience of folks like Brian and Andrew than on marketing materials. I’m afraid the answer might be: how else could they justify the price?