Yeah yeah, if you think so that’s alright.
Here is the well-respected (and qualified expert…) Rob Watts (of Chord) talking about how an ASIO driver may sound different from an equally bit-perfect WASAPI driver… both obviously capable of bit-perfect playback (a little like your ALSA vs CoreAudio in Exclusive Mode example):
This has got nothing to do with bits being affected, as you’ve already said earlier in the thread.
A qualified and respected DAC designer like Rob has the unique position of understanding both the digital side and what can potentially affect the analogue performance side… something a software-only engineer (for example) may not fully understand…
And this is just looking at the example of ASIO vs WASAPI (in exclusive mode) potentially sounding different - on the same hardware, same OS, and same everything else… In your example, you are comparing different OSes and different hardware…
Hopefully this doesn’t upset any Chord DAC owners that claim to know more than the Chord designer himself… Somebody earlier called them charlatans? They’re always grumbling and waffling around the place though…
I love talking to the designers of the gear we all love and use. There’s plenty to learn and those respected designers (qualified experts) are often still learning things themselves, as they will openly and humbly admit… even the ones designing already state of the art gear…
Hope this helps.
If you are talking about me (I did grumble), no, I wasn’t referring to Rob Watts; I like his work. But there are plenty of charlatans out there.
A very close friend of mine ran an industrial cable company that supplied everything from cellular to power utilities. He had a customer who bought his standard cables and relabeled them as audiophile at a 100x price hike. And I’m sure those got rave reviews.
I don’t know why an HDMI cable would produce different picture quality either. If there’s noise, the picture should show broken blocks rather than a change in picture quality (more vivid colours, more detail, better contrast…).
But I can confirm that sometimes changing the HDMI cable produces a brighter screen. Why, I don’t know.
I have asked this question in another AV forum. No one can provide me an answer.
Why not do a blind test, see if you can tell if there’s a difference, and which one is which?
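If you do try it, the scoring side of a blind test is easy to sanity-check with a few lines of code. Here is a minimal ABX-style sketch in Python; the trial count and seed are purely illustrative, not from any particular test protocol:

```python
import random
from math import comb

def abx_trials(n_trials: int, seed: int = 0) -> list:
    """Randomly assign A or B as the hidden 'X' for each trial."""
    rng = random.Random(seed)
    return [rng.choice("AB") for _ in range(n_trials)]

def score(answers: list, guesses: list) -> int:
    """Count how many trials the listener identified correctly."""
    return sum(a == g for a, g in zip(answers, guesses))

def p_value(correct: int, n_trials: int) -> float:
    """Chance of scoring at least this well by pure guessing (binomial, p = 0.5)."""
    return sum(comb(n_trials, k) for k in range(correct, n_trials + 1)) / 2 ** n_trials

# Example: 14 correct out of 16 would be very unlikely by guessing alone.
print(round(p_value(14, 16), 4))  # prints 0.0021
```

The point of the p-value is that "I got a few more right than wrong" over a handful of trials proves nothing; you need enough trials that guessing becomes implausible.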
Probably because one cable is old (HDMI 1.3 or lower) and the other is new (1.4).
1.3 cables are still widely sold and usually perform very well, though theoretically they do not fully support 1080p resolution. 1.4 cables do support the higher resolutions.
Also be aware that most manufacturers of AV equipment do not have their HDMI implementation fully licensed because the licensing is prohibitively expensive, even for big companies. Sometimes this results in behaviour that is not fully up to standard, though mostly in peripheral functions like CEC.
As of 2009, HDMI cable vendors are forbidden from labelling their cables with version numbers; don’t ask me why.
Nobody seems to have mentioned the possibility of lack of volume matching, which can have a huge perceived difference. Is this a possibility in these setups?
What I miss in this discussion is respect for Hans Beekhuyzen’s statement:
We measure that nothing is broken or badly designed; for the quality, trust your ears.
My setup is as follows:
- Internet router with mesh technology; no computer-related source is hardwired to it
- Only music-related units are hardwired, with SupraCable Cat 8, to a Catalyst 2960 Series switch
- Nucleus by Roon and a NAS
- USB cable to the DAC, alternating between the Amanero USB input and the Pre Box S2 Digital
- ATC SCA-2 preamp driving ATC SCM-50 ASLT speakers
When I switch the Roon Core from the Nucleus to my MacBook Pro (late 2014, highest spec, SSD, all unused USB ports plugged with AQ JitterBugs) and back again, there is an obvious difference in sound quality.
Whatever is supposed to be possible, I don’t care, as long as I can prove it with my friends and family. My system is in a different room from my listening position, so nobody can see whether the Mac or the Nucleus is on duty.
And believe me, if there were no audible difference between the Mac and the Nucleus by Roon, I would never ever spend 1.5 or 2.5 K€ on a unit without benefit.
NO WAY, never ever!
Yes, volume makes a huge difference, and DAC comparisons should indeed be volume matched. However, the subject here is Roon Bridge, which by definition outputs digital. When the DSP engine and volume leveling are turned off and Roon is configured properly so that the OS mixer does not change the audio data, Roon outputs bit-perfect data, which implies the volume is the same for the same DAC.
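For what it’s worth, bit-perfect is something you can verify rather than debate: capture the stream on the digital side of each playback path (e.g. via a loopback or capture device) and compare the raw sample data. A minimal sketch in Python; the captures here are stand-in byte strings, not real recordings:

```python
import hashlib

def pcm_fingerprint(pcm_bytes: bytes) -> str:
    """Hash raw PCM sample data; identical hashes mean bit-identical audio."""
    return hashlib.sha256(pcm_bytes).hexdigest()

# Two playback paths are bit-perfect relative to each other
# exactly when their captured sample data hashes match.
capture_a = bytes([0, 1, 2, 3] * 256)  # stand-in for a capture of path A
capture_b = bytes([0, 1, 2, 3] * 256)  # stand-in for a capture of path B
print(pcm_fingerprint(capture_a) == pcm_fingerprint(capture_b))  # prints True
```

If the hashes match, any remaining audible difference would have to come from the analogue side (noise, clocking, power), not from the bits themselves.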
Never ever witnessed that, and I have used a lot of different HDMI cables over the years. They either work, produce artefacts, or fail the HDCP handshake, resulting in a green or black screen; nothing more. Power, on the other hand, can affect the set’s picture through its ability to drive the backlight at full brightness, and so cause brightness differences. Perhaps bad cables cause a power drain in some cases and that is what’s happening? It goes against the whole principle though, as TVs are purely digital for picture processing and display.
I don’t think a blind test on video quality would be all that successful :)
Nice, I’ve got the same preamp paired with the SCM100ASLT
Were you able to demo the Nucleus in your setup prior to purchase?
Sound quality not as good as expected
I can’t check at the moment, but aren’t there still various options for how volume is handled in say, Roon Bridge on a Pi? Fixed, device etc?
Wow, the SCM-100! That big boy is too large for my room.
Yes, my company got the Nucleus distribution
Now, I have to say, I didn’t want to get rid of the Nucleus…