Roon 1.8 sound quality change?

Extract of the relevant section…

Signal integrity impacts many electronic design disciplines. But until a few years ago, it wasn’t much of a problem for digital designers. They could rely on their logic designs to act like the Boolean circuits they were. Noisy, indeterminate signals were something that occurred in high-speed designs – something for RF designers to worry about. Digital systems switched slowly and signals stabilized predictably.
Processor clock rates have since multiplied by orders of magnitude. Computer applications such as 3D graphics, video and server I/O demand vast bandwidth.
NB: one can add high-end digital audio
Much of today’s telecommunications equipment is digitally based, and similarly requires massive bandwidth. So too does digital high-definition TV. The current crop of microprocessor devices handles data at rates up to 2, 3 and even 5 GS/s (gigasamples per second), while some DDR3 memory devices use clocks in excess of 2 GHz as well as data signals with 35-ps rise times.
Importantly, speed increases have trickled down to the common IC devices used in automobiles, VCRs, and machine controllers, to name just a few applications.
A processor running at a 20-MHz clock rate may well have signals with rise times similar to those of an 800-MHz processor. Designers have crossed a performance threshold that means, in effect, almost every design is a high-speed design.
Without some precautionary measures, high-speed problems can creep into otherwise conventional digital designs. If a circuit is experiencing intermittent failures, or if it encounters errors at voltage and temperature extremes, chances are there are some hidden signal integrity problems. These can affect time-to-market, product reliability, EMI compliance, and more. These high-speed problems can also impact the integrity of a serial data stream in a system, requiring some method of correlating specific patterns in the data with the observed characteristics of high-speed waveforms.
Why is Signal Integrity a Problem?
Let’s look at some of the specific causes of signal degradation in today’s digital designs. Why are these problems so much more prevalent today than in years past?
The answer is speed. In the “slow old days,” maintaining acceptable digital signal integrity meant paying attention to details like clock distribution, signal path design, noise margins, loading effects, transmission line effects, bus termination, decoupling and power distribution. All of these rules still apply, but…
Bus cycle times are up to a thousand times faster than they were 20 years ago! Transactions that once took microseconds are now measured in nanoseconds. To achieve this improvement, edge speeds too have accelerated: they are up to 100 times faster than those of two decades ago.
This is all well and good; however, certain physical realities have kept circuit board technology from keeping up the pace. The propagation time of inter-chip buses has remained almost unchanged over the decades. Geometries have shrunk, certainly, but there is still a need to provide circuit board real estate for IC devices, connectors, passive components, and of course, the bus traces themselves. This real estate adds up to distance, and distance means time – the enemy of speed.
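As an aside, here is a minimal sketch of when a trace must be treated as a transmission line (Python; the half-rise-time rule of thumb and the FR-4 figure are my assumptions, not from the article):

```python
# Common signal-integrity rule of thumb (assumption, not from the article):
# a PCB trace behaves as a transmission line once its propagation delay
# exceeds roughly half the signal rise time.
RISE_TIME_PS = 35.0            # the DDR3 data-signal rise time quoted above
FR4_DELAY_PS_PER_INCH = 150.0  # typical FR-4 microstrip delay (assumption)

critical_length_in = (RISE_TIME_PS / 2.0) / FR4_DELAY_PS_PER_INCH
print(f"Critical trace length: ~{critical_length_in:.2f} inches")  # ~0.12 in
```

At a 35-ps rise time, even a tenth of an inch of trace is electrically long, which is the point the article is making.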

2 Likes

Maybe it is time to once again put some rocks onto the DAC. I’m sure it can’t hurt to put another one on top of your Roon Core.

http://audiophile.rocks/obscurumultima.html

2 Likes

But then it closes the lid and goes to sleep; it sure becomes silent then, hahaha.
I do put weight on audio equipment, but only on top of the subwoofer.

I think it’s time to put some rocks in a tumbler and mix up a batch of lethal margaritas. Guaranteed sonic bliss…

Yes, I know all this. This is stuff I learned in school, all very true for circuit and PCB design, but what has it got to do with software?

1 Like

10 grand for a rock? I am running out of words…

Okay, what exactly are you trying to tell me?
I just knew you wouldn’t respond conclusively.

To get back to the topic:
Comparative measurements of well-regarded DACs’ analog output under differing digital input scenarios do exist (Stereophile, ASR, Archimago, to name a few).
These show generated artifacts below about -120 dB.
Now look at most amp specs and measurements, and you’ll find their noise floor is at least 20 dB worse, masking these DAC artifacts. That is a factor of 100, since human loudness perception is related to sound power (see the quick sketch below).
Now listen to your tweeter up close: if you can’t hear any amplifier noise, how can you hear these artifacts?
And if you do hear amp noise, get a better amp before postulating anything publicly in a forum.
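A minimal sketch of the dB arithmetic behind that factor of 100 (Python; just the standard dB-to-power conversion):

```python
def db_to_power_ratio(db: float) -> float:
    """Convert a level difference in dB to a power ratio: 10^(dB/10)."""
    return 10.0 ** (db / 10.0)

print(db_to_power_ratio(20.0))   # 100.0 -> a 20 dB higher noise floor is 100x the power
print(db_to_power_ratio(120.0))  # 1e12  -> how far -120 dB artifacts sit below full scale
```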

6 Likes

It has to do with CPU or GPU load, sharing of processes, and I/O interrupts on high-frequency buses. As Tektronix points out, sensitivity to jitter increases with the carrier frequency of the digital signal flow. Hopefully the Roon Team can provide some insight into what is changing, but clearly two processes doing the same thing, with different efficiency, synchronicity of I/O and bus transfer of data, may exhibit, for instance, different jitter. Less jitter can improve treble transients, which may explain the need to balance this with a small boost at very low frequencies.
It should also be especially noticeable between the two channels, since the human ear is extremely sensitive to interaural time delays. Serious researchers have measured detectability of L/R delays down to 2 microseconds, peaking around 2 kHz (see the sketch below)… which can help explain why aging musicians, sound engineers or audiophiles can still be very sharp at assessing sound even when their youthful ability to hear, say, 20 kHz is no longer there.
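To put that 2-microsecond figure in perspective, a small sketch (Python; the speed of sound is my assumption, not from the research cited):

```python
# What a 2-microsecond L/R delay means physically.
ITD_S = 2e-6     # the detectability threshold cited above
C = 343.0        # speed of sound in m/s (assumption)

path_difference_m = ITD_S * C                  # extra travel to the farther ear
phase_deg_at_2khz = ITD_S * 2000.0 * 360.0     # phase shift at the 2 kHz sensitivity peak

print(f"Path difference: {path_difference_m * 1000:.2f} mm")  # ~0.69 mm
print(f"Phase at 2 kHz: {phase_deg_at_2khz:.2f} degrees")     # ~1.44 degrees
```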

1 Like

Let us not forget that on the other end of the digital flow, the DAC, the rate of information is fixed. At 384 kHz, for instance, it must receive one sample exactly every 1.3 microseconds.

Or actually two samples, one for each channel.

Oops, wrong by a factor of 1000:
1/384000 = 2.6E-6, which is microseconds.
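For the record, a quick check of the sample periods (Python):

```python
# Sample frame period at common PCM rates.
for rate_hz in (44_100, 96_000, 192_000, 384_000):
    print(f"{rate_hz / 1000:>6.1f} kHz -> one frame every {1e6 / rate_hz:.2f} microseconds")
# 384.0 kHz -> one frame every 2.60 microseconds
```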

1 Like

If you are so worried about CPU load, GPU load and sharing of processes, why do you use any DSP and upsampling at all? Why use Roon? Why not use a command-line player with next to no overhead? If you had read the article, you might have understood that problems do indeed increase with frequency. Many receiver chips nowadays work better at 96 kHz or 192 kHz than at 384 kHz; that is measurable, and in fact I have measured it. As the article says, higher rates are more prone to jitter, more prone to EMI/RFI, and much more care has to be taken with the proper impedances of your transmitter/cable/receiver system. I never understood why people upsample to 384 kHz; probably because they believe higher numbers must be better, but that’s not the case in many situations. Maybe people like the sound of added jitter: it often sounds like more detail and air, but in the end that’s artificial.

3 Likes

It isn’t real. It’s a goof. Did you notice what the rock actually resembles?

You do realize this is satire (of the highest order)?

No, I closed the page after I saw the price. Maybe this particular page is a goof; however, I have seen so much voodoo stuff over the last decades. I believe you will always find some lost souls willing to pay a fortune for some very weird components. I have invested a lot of money in my equipment, but I claim for myself not to use any voodoo stuff.

It has been a long time since I have seen a jitter spectrum graph. Long ago there was an interesting comparison of these graphs in Stereophile. Did you spot it too? It showed a clear correlation between perceived musicality and jitter spectrum that was not otherwise explained. Was it so interesting that it was never published again?
The same goes for very interesting spectral graphs of harmonic distortion measurements of speakers at different acoustic powers: 75 dB at one metre is fine, but what about 85 dB or 95 dB? This type of measurement, which shows how distortion increases very rapidly with power below 100 Hz for so many 8- or 10-inch speakers, is a much weaker selling point for small boxes than the near-perfect frequency response, which is so good on many of them that one might wonder what the point is of going for anything more sophisticated. And yet a reproduced piano on these does not sound like a real piano… at all.

I am convinced that commercial pressure on the hi-fi press discourages publishing the most critical measurements, for the sake of marketing eternal progress, and that this is an international consensus that serves the industry rather than helping the discriminating consumer.
If one reads the press over the years, today’s speakers blow away anything 20 years old. And digital audio was supposedly perfect from day one. Two statements whose validity we can all judge for ourselves…

Oops, I wrote too quickly; sorry for the typos, which I am editing right now. It is microseconds for the human ear as well. The reasoning is unchanged.

Sorry (no, not really) to nitpick once more, but this time it is only wrong by a factor of 3.6: assuming human auditory angular perception can discern 1 degree, this equates to a temporal resolution of about 7.3 microseconds.
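A sketch of where the 7.3 microseconds can come from (Python; the simple path-length model and the ear-spacing value are my assumptions):

```python
import math

# Simple geometric model: ITD = (d / c) * sin(theta).
D = 0.143                  # effective ear spacing in metres (assumption)
C = 343.0                  # speed of sound in m/s (assumption)
theta = math.radians(1.0)  # the 1-degree angular resolution assumed above

itd_s = (D / C) * math.sin(theta)
print(f"ITD for 1 degree off-centre: {itd_s * 1e6:.1f} microseconds")  # ~7.3 us
```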

But in binaural localization things are actually a little more complicated, since interaural time delay only works up to frequencies of about 1.5 kHz (inter-pinna distance vs. sound wavelength; quick check below).
At around 1.5 kHz, interaural level difference takes over for localisation purposes, further complicated by frequency-response shaping through the pinnae.
Of course, there’s some overlap between about 1 to 2.5 kHz also…
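A quick check of why the crossover sits near 1.5 kHz (Python; speed of sound assumed):

```python
# The interaural phase cue becomes ambiguous once the wavelength
# approaches the spacing between the ears.
C = 343.0          # speed of sound in m/s (assumption)
FREQ_HZ = 1500.0

wavelength_m = C / FREQ_HZ
print(f"Wavelength at 1.5 kHz: {wavelength_m * 100:.0f} cm")  # ~23 cm, about head-sized
```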

1 Like

I remember, but as you say: long ago.
Even many cheap DACs have virtually flawless jitter spectra these days; I’m glad I didn’t invest in one of these Mark Levinsons and the like back then…
Even the analog output stages of some el cheapo DACs outperform many of yesterday’s flagships…

… whatever makes you happy, but please keep your conclusions to yourself if you’ve got no proof!

2 Likes

I am not worried with the 4-core i7 at 2.6 GHz in the MacBook Pro I am using now.

As I said earlier in the discussion, if I avoid running the dynamics computation as well, I get 10-12% CPU load for each of Roon and HQPlayer. And the sound is detailed but fluid, neither systematically harsh nor overly smooth.
A few years ago I used a MacBook Air that had only 2 cores at 1.something GHz, and I did get a harsh digital sound when I used long interpolation filters (sinc-M), which are theoretically accurate to within 100 dB but compute-intensive. There was a lot of CPU-induced jitter.
Now I am happy with the new MacBook Pro, as I just confirmed the sound is very fluid. Even slightly more so with 1.8 now.

I believe this very high interaural temporal sensitivity of the human ear is one of the reasons why oversampling beyond the human perception limit of around 20 kHz at best is justified, while taking care not to confuse the sampling rate with the accuracy of the reconstructed waveform… If 2 microseconds were the sampling period, the Nyquist frequency would be… 250 kHz, far above the 40 kHz rate needed for 20 kHz. That, I believe, is an argument for using 256 or 384 kHz rather than, say, 96 kHz: feed the DAC an already finely resampled waveform, using a much more accurate operator than what can be done inside the DAC, unless it contains a full box of FPGAs.
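The arithmetic, spelled out (Python; just restating the numbers in the paragraph above):

```python
# A 2-microsecond sample period implies a 500 kHz sample rate,
# hence a 250 kHz Nyquist frequency.
PERIOD_S = 2e-6
rate_hz = 1.0 / PERIOD_S    # 500,000 Hz
nyquist_hz = rate_hz / 2.0  # 250,000 Hz
print(f"Sample rate: {rate_hz / 1000:.0f} kHz, Nyquist: {nyquist_hz / 1000:.0f} kHz")
# Compare with the 40 kHz rate (20 kHz Nyquist) needed to cover the audible band.
```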

We are here on this Roonlabs community to share experience, understand what others are reporting, make useful remarks and hopefully contribute suggestions to the Roon Team that provide more guidance on an obviously complex, multi-parameter topic. Many audio forums are poisoned by useless religious wars, snipers in ambush, and sometimes hidden commercial interests. Hopefully there are none here, and we should all continue to be critical, but constructive.

I have a suggestion for the Roon Team. Could they produce a jitter measurement, and ideally a jitter spectrum comparison, between 1.7 and 1.8? Jitter seems a good candidate to explain a variety of observations and concerns, and it would really help if you could assist in consolidating those technical arguments. There is no magic there; for sure everyone has some bias, but statistically there are differences, and we lack hard science, some lab measurements of the right phenomena, as we are clearly beyond “bits are bits” circular arguments. It would be great to have more insights from the provider!

1 Like