Assuming all DSP is off in Roon, there is no difference in what Roon 1.5 and Roon 1.6 are handing off to HQPlayer. None.
How do I know this?
My DAC, the PS Audio DirectStream, has a test file I can play that will tell me, on its display, whether the track is being played bit perfect. In other words, not altered at all. To test, I turn off all DSP in Roon and all processing in HQPlayer.
I have done this test multiple times with Roon 1.5, and each time the track plays bit perfect; the DirectStream display confirms it. When I turned on and used any DSP function in Roon, the track did not play bit perfect according to my DAC's display. If I turn off DSP in Roon but turn on upsampling in HQPlayer, the track likewise does not play bit perfect according to the display.
So, there is no difference between Roon 1.5 and Roon 1.6 when it comes to HQPlayer. Any difference you perceive is attributable to expectation bias.
Your statement that there is no difference between Roon 1.5 and 1.6 when it comes to HQPlayer is misleading; in the particular setting I used, it is incorrect. If you read my earlier explanations you will understand why, but let me recap for clarity.
The differences I perceived when moving from 1.5 to 1.6 were in configurations with HQPlayer AND Roon DSP both ON.
It has already been discussed that this first 1.6 release truncates filter definitions by rounding all the Q filter settings to one decimal place, with a slider and a one-decimal number display. As a result, I could not listen on 1.6 with the same filters I had been using on 1.5. That alone is factual and can explain the difference I perceived.
Scott Winders, two audio streams that are "bit perfect" do not necessarily sound the same due to, inter alia, the effect of time-domain errors such as digital jitter. Other issues also come into play, such as noise level and the noise spectrum. If "bit perfect" were a sufficient condition for "perfect sound" from digital audio devices, then many of the simplest and cheapest devices would sound identical to the most complex and expensive ones. This is demonstrably not the case, though I certainly wish it were so.
We are talking about the exact same hardware platform with the same power supplies and the same clocks. Even the same data on the same drives. The only change is a software change from Roon 1.5 to Roon 1.6. The problems you describe are pretty much hardware related and, since the hardware isn't changing, are not relevant.
Inter alia? You aren't writing an academic paper here, so there is no reason to use a Latin phrase that most people have never heard or read.
No pity at all. There is a difference between dumbing down and not using a dead language phrase only appropriate for legal briefs and college law classes.
Roon 1.5 and Roon 1.6 outputs are bit perfect and identical (DSP deactivated). I have verified this on a 96kHz/24-bit test track. A correlation-depth computation in Audio DiffMaker, comparing the output against the original track, showed it beyond any doubt.
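For anyone who wants to repeat this kind of check without Audio DiffMaker, a minimal sketch of a sample-for-sample comparison of two PCM captures is below. The file names are hypothetical; it assumes you have loopback recordings of the same track played through each Roon version, saved as uncompressed WAV:

```python
import wave

def is_bit_identical(path_a: str, path_b: str) -> bool:
    """Compare two PCM WAV captures sample-for-sample (a simple null test)."""
    with wave.open(path_a, "rb") as a, wave.open(path_b, "rb") as b:
        # Channels, sample width, sample rate, and frame count must all match.
        if a.getparams()[:4] != b.getparams()[:4]:
            return False
        return a.readframes(a.getnframes()) == b.readframes(b.getnframes())

# Hypothetical capture files of the same test track:
# is_bit_identical("roon_1_5_capture.wav", "roon_1_6_capture.wav")
```

This only answers the "are the bits identical?" question; it says nothing about timing or noise, which is exactly the distinction argued over later in this thread.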
Bits are Bits.
Unfortunately:
Software is software. The same "bit-perfect" asynchronous transport can be achieved in various ways: USB with an ASIO driver, USB with a Core Audio driver, the RAAT protocol to a Roon endpoint, or other over-the-air protocols such as AirPlay, Devialet AIR, etc. On arrival the bits are the same, no doubt, but they are carried in different ways: packet sizes, error correction, master/slave configuration, and so on. For USB, the job of the USB receiver chip is to collect all the bits arriving, buffer them, and clock them as accurately as possible out to the analog converter, typically via I2S.
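The "same bits, different carriage" point can be illustrated with a toy sketch. The packetization model below is an assumption for illustration only, not any real USB or RAAT framing; it just shows that chunking the same payload differently still reassembles to identical bytes:

```python
def deliver(payload: bytes, packet_size: int) -> bytes:
    """Chunk a payload into fixed-size 'packets', then reassemble at the receiver."""
    packets = [payload[i:i + packet_size] for i in range(0, len(payload), packet_size)]
    return b"".join(packets)  # the receiver's buffer restores the original stream

pcm = bytes(range(256)) * 16  # stand-in for a block of PCM audio data

# Different packet sizes, identical delivered bits:
assert deliver(pcm, 64) == deliver(pcm, 192) == pcm
```

What this toy model deliberately leaves out is the timing of delivery, which is where the buffering and clocking arguments in the next paragraphs come in.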
USB drivers also need computer resources to behave as expected; if the CPU, memory, or network is loaded, drivers have to compete. Buffers are there to make sure the stream is never starved, but the way the transport is actually carried out can be affected. Still, the bits on arrival are the same. Maybe Roon 1.6 is more demanding, or has a different CPU/memory/network footprint, than Roon 1.5?
So one might ask: who cares about the quality of the transport if the bits are put back in order anyway?
Two answers:
answer 1, merely a hypothesis: even if, in principle, the asynchronous reception of bits, the buffering, and the generation of the synchronous I2S stream are independent of each other, one cannot rule out side effects on the decoding chip. It's all analog inside a chip.
answer 2, which is solid in my experience: current leaks and all kinds of noise can travel through the USB wires (or Ethernet wires) and find their way to DAC clocks, converters, etc. Isolation is never perfect and does not block all types of noise. That's why most people perceive an improvement when replacing stock power supplies with linear power supplies.
Wow, I just realised we are writing episode 35467 of "Bits are bits, what do you expect?" here.
@Ian_Richards terrific setup! To be honest, I don't really know what to answer. If you are certain that 1.5 sounds marginally different from 1.6, then it has to do with the different software footprint of 1.6 making the USB transport behave differently than under 1.5. What is for sure is that the bits sent to your F1 USB XMOS receiver are identical in both cases.
Sorry about what? We are all facing a new release and a new interface every so often, and learning to find our way around. Learning along the way is important, but it has to be done with an open mind and rationally, without preconceptions or denial. Over the course of this thread, starting from a first negative experience with 1.6, I have learned to disable DSP while maintaining the exact bass EQ I had optimized, and even to make progress using the latest HQPlayer Desktop features.
And sharing this knowledge might be even more important.