An interesting "Linus" video on the "Audio Quality Ethernet"

… but distortion is defined by how the signal changes from source to destination.
For you to hear a difference, the signal must be changed. Or what is your belief?

Yes, but how much must a signal be changed by distortion for us to notice the distortion?

It has been scientifically established that there are limits to human auditory capabilities, which, by the way, do not exceed the resolution of measurement equipment.
Google is your friend here, if you’re really interested in the facts…

And those limits make my point…that distortion is not as big a deal as some people make it out to be.

If the right conclusions are drawn, that’s fine. Eyes on the ball: that’s mainly speakers and the room, by a large margin. Decently measuring electronics aren’t a problem anymore, even down to consumer prices. If what one wishes for is not true fidelity but some colouration of the sound, walk away from neutral, state-of-the-art equipment and go for fancy tubes. Nothing wrong with that, just don’t call it HiFi, because it isn’t. Cables, networks and voodoo in general bring nothing to the table that a human could measure with their ears.

1 Like

If you read my earlier experiment and the conclusions, EMI/RFI via ordinary network hardware affecting the noise floor, and therefore sound quality, is a non-issue. I pulled a signal which was below the threshold of audibility (-120 dBFS) up to an audible level using a lot of clean gain. The fact that the 1 kHz tone was audible above the noise means that the noise was also well below the threshold of audibility and therefore a non-issue.

Something I omitted to check when I ran my little experiment was the input sensitivity of the amplifier. By taking the preamp’s gain up to +4 dB to enable the test tone to be heard, I was in fact overdriving the power amps’ inputs. With a music signal playing, the amps’ outputs would have clipped (380 W per channel into 8 ohms) while the test signal was still 4 dB below audibility. In short, your ears would be bleeding while the test signal was still 4 dB lower than “silence”.
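
For anyone who wants to sanity-check the arithmetic, here is a rough Python sketch. Only the 380 W / 8 ohm clipping figure and the -120 dBFS tone level come from the experiment above; the simplifying assumption (mine, for illustration) is that the tone passes through exactly the same gain chain as a full-scale signal.

```python
import math

# Rough sanity check of the levels in the experiment above.
# Assumption for illustration: the -120 dBFS tone goes through exactly the
# same gain chain as a full-scale (0 dBFS) signal.

P_CLIP_W = 380        # power amp clipping power into 8 ohms (from the post)
R_LOAD_OHM = 8        # nominal load impedance
TONE_DBFS = -120      # test tone level relative to digital full scale

v_clip = math.sqrt(P_CLIP_W * R_LOAD_OHM)        # RMS voltage at clipping (~55 V)
v_tone = v_clip * 10 ** (TONE_DBFS / 20)         # tone voltage at the speaker terminals
p_tone = v_tone ** 2 / R_LOAD_OHM                # tone power at the speaker

print(f"Amp clips at ~{v_clip:.1f} V RMS ({P_CLIP_W} W into {R_LOAD_OHM} ohms)")
print(f"-120 dBFS tone at the same gain: ~{v_tone * 1e6:.0f} uV, ~{p_tone * 1e9:.2f} nW")
# ~55.1 V RMS vs ~55 uV: the tone sits a full 120 dB (a factor of 10^12 in power)
# below the point where your ears would be bleeding.
```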

It’s high time audiophiles let go of these oft-repeated myths.

3 Likes

To a degree; however, distortion is also relative in terms of audibility.

This is what Bruno Putzeys has to say about it:

It’s an interesting read. Especially noteworthy is that Bruno is now concentrating his efforts on the areas where there is still the most to be gained.

1 Like

And I explained to you earlier that you do not understand what you’re talking about.

You already accepted earlier that your technical knowledge wasn’t in depth enough to fully understand some of the concepts being discussed.

You claim that noise won’t translate into the analogue domain, but that it will somehow compromise the DAC’s sensitive digital-to-analogue conversion process and degrade the sound quality by some hitherto undocumented mechanism? If this is your hypothesis, then please provide some factual evidence based on science and electrical theory to substantiate your claim; otherwise, I politely request that you keep quiet.

If you continue to repeatedly push the same unsubstantiated claims into your replies to my posts, I will have no choice but to mute your contributions from my feed.

Audiophiles claim time and time again that noise from ethernet can raise the noise floor of the DAC’s analogue output. This experiment, regardless of what you say about it, proves otherwise.

2 Likes

Watch what the Golden Sound video says, with measurements, about electrical noise and the DAC, starting at 2’33".

The whole video is interesting for various topics that have been discussed recently in some threads.

1 Like

At 3’07" … “The noise is being carried over the USB connection”.

What does this have to do with ethernet cables?

I haven’t watched all of the video, but it seems to me he’s talking about methods of galvanically isolating a USB signal. As ethernet has galvanic isolation (assuming you’re not being silly and using Cat7 or Cat8 cables) I’m not sure why you think this video is relevant to the current discussion. Could you clarify?

My post did not refer to Ethernet cables, but to @Graeme_Finlayson’s test with the 1 kHz tone at -120 dB. He expected that noise in the digital section of the setup would translate into audible noise in the analogue section.

This was what you said:

This was the reason I posted the video, which backs up my claim.
I just did what you asked me to do.

More on the consequences of noise for DAC performance in this quote from a Golden Sound review of a device:

“You don’t want noise from your source causing your DAC to perform poorly. Noise can have a direct, audible effect, such as hearing GPU-whine through your headphones/speakers, or it can have indirectly-audible effects. For example, causing clocks in your dac, or other circuitry, to perform sub-optimally.
Some dacs are more immune to this, and some will even have full galvanic isolation to in theory prevent any noise getting through entirely. But many smaller dacs are much more susceptible to it.”

Indeed, yet that is the topic, so please stop posting links to unrelated articles.

@Archimago weighs in:

If a ground loop in your ethernet cable caused unrecoverable data problems, then this sentence might look like this:

If a ogdrnu ploo in oury terehten labcesadcue bvuarcreneloe atad pelbmros hn,et isthectnense mgthi ookl leki ist:h

…and so would the rest of this page. But here you are, reading perfectly-transmitted data sent through all sorts of crappy cables and network equipment for thousands of miles between my keyboard and your screen.

The engineers who invented networking protocols knew what they were doing. Let’s give them the credit they deserve.
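
As a toy illustration of why flipped bits never reach the application as scrambled text: every Ethernet frame carries a CRC-32 frame check sequence, a frame that fails the check is simply dropped, and TCP retransmits the missing data. The sketch below uses Python’s standard zlib CRC-32 (the same polynomial family as the Ethernet FCS, though the on-the-wire bit conventions differ), purely to show that even a single flipped bit is caught.

```python
import zlib

# Toy illustration: a frame payload with a CRC-32 appended, the way Ethernet
# appends its Frame Check Sequence. A receiver recomputes the CRC and drops
# any frame that doesn't match; TCP then retransmits the missing data.

payload = b"If a ground loop in your ethernet cable caused data problems..."
fcs = zlib.crc32(payload)

# Simulate electrical noise flipping a single bit mid-frame.
corrupted = bytearray(payload)
corrupted[10] ^= 0x04

print(zlib.crc32(payload) == fcs)           # True  -> frame accepted
print(zlib.crc32(bytes(corrupted)) == fcs)  # False -> frame dropped and re-sent
```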

4 Likes

Ground loops don’t cause unrecoverable data problems (TCP takes care of this), but they can allow noise and hum to find their way into places that they shouldn’t.

Shielded ethernet is fine in the purely digital realm where appropriate - it’s not fine to use it where there will be susceptible analogue devices at the end of the chain.

I’m not saying there will always be problems, but why introduce the risk by deploying shielded cables where Cat6 UTP is the absolute maximum required to get the job done?

2 Likes

I’m not aware of any analog devices that are part of an Ethernet connection.

Well, any DAC with an integrated streamer board would be a candidate.

1 Like

If you connect a shielded ethernet cable (which is grounded at both ends) between a switch and a streamer, which then connects to a DAC via anything other than TOSLINK, and then on to a pre-amplifier/power amplifier, you have likely (depending on each component’s grounding arrangements) created multiple paths to earth of differing impedances along the chain.

This is how to create ground loops, which can ultimately produce low-level noise and hum all the way to your headphones or speakers.
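
For a feel of the magnitudes involved, here is a back-of-the-envelope sketch. The loop current and ground impedance figures are invented purely for illustration; real installations vary enormously, which is exactly why ground-loop hum is so unpredictable.

```python
import math

# Back-of-the-envelope: mains-frequency current circulating around a shield /
# chassis ground loop develops a voltage across the shared ground impedance,
# and that voltage rides on the analogue signal reference.
# Both figures below are invented purely for illustration.

i_loop_a = 0.005        # assumed 50/60 Hz loop current: 5 mA
z_ground_ohm = 0.1      # assumed shared ground-path impedance
v_signal = 2.0          # typical line-level signal, V RMS

v_hum = i_loop_a * z_ground_ohm                 # hum voltage injected into the signal path
hum_db = 20 * math.log10(v_hum / v_signal)      # level relative to the signal

print(f"Injected hum: ~{v_hum * 1e6:.0f} uV ({hum_db:.0f} dB relative to a 2 V signal)")
# ~500 uV, roughly -72 dB: whether that is audible depends on downstream gain
# and speaker sensitivity, and it gets worse if either figure above is higher.
```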

Remedying ground loop noise can be an exercise in hair-pulling frustration.

My advice to anyone is not to add anything which increases the risk of noise or hum. So don’t add a shielded ethernet cable (which you think is better because the hi-fi salesman told you so): it provides an additional ground connection and increases the risk of polluting the ground planes of components with noise. Use a standard UTP cable and exploit ethernet’s inherent galvanic isolation and differential signalling.
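
To illustrate what differential signalling buys you: Ethernet sends each signal as a complementary pair, and the receiver looks only at the difference between the two conductors, so noise that couples equally onto both of them cancels out. A toy sketch with invented numbers:

```python
# Toy model of differential signalling: noise that couples equally onto both
# conductors of a pair (common-mode noise) cancels when the receiver takes
# the difference between them. All values invented for illustration.

signal = 0.5                 # differential signal amplitude, volts
common_mode_noise = 0.2      # noise coupled identically onto both wires

wire_plus = +signal / 2 + common_mode_noise
wire_minus = -signal / 2 + common_mode_noise

received = wire_plus - wire_minus    # what the differential receiver sees
print(received)                      # 0.5 -> the coupled noise has cancelled out
```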

6 Likes