Context first: I have a test file I’m trying to play. It is intentionally at a very high level: it alternates two samples at +0 dBFS with two samples at -0 dBFS. Every individual sample is legal, but the sine wave the data encodes is about 3 dB over full scale. It’s intended to detect how a DAC will respond to such a stimulus.
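For anyone who wants to see the numbers, here is a minimal sketch (assuming NumPy is available) that generates the pattern described above and estimates its true peak by 4x FFT oversampling, which is an idealized stand-in for the sinc interpolation a DAC’s reconstruction filter performs between samples:

```python
# Sketch: two samples at +1.0 (0 dBFS), two at -1.0, repeated. This encodes
# a sine at fs/4 whose reconstructed amplitude is sqrt(2), i.e. +3.01 dBTP,
# even though no coded sample exceeds 0 dBFS.
import numpy as np

N = 64                                        # a whole number of 4-sample periods
x = np.tile([1.0, 1.0, -1.0, -1.0], N // 4)

peak_sample = np.max(np.abs(x))               # largest coded sample value

# 4x oversample by zero-padding the spectrum (ideal band-limited
# interpolation for this periodic signal).
oversample = 4
X = np.fft.rfft(x)
Y = np.zeros(oversample * N // 2 + 1, dtype=complex)
Y[: len(X)] = X
y = np.fft.irfft(Y, n=oversample * N) * oversample

true_peak = np.max(np.abs(y))
print(f"sample peak: {20 * np.log10(peak_sample):+.2f} dBFS")  # +0.00 dBFS
print(f"true peak:   {20 * np.log10(true_peak):+.2f} dBTP")    # +3.01 dBTP
```

The oversampled waveform hits sqrt(2) ≈ 1.414 between the coded samples, which is exactly the 3.01 dB inter-sample overshoot the test file is designed to provoke.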
I’ve set up Roon to deliver bit-perfect data, but I’m not getting the results I expect. I’m getting the results I would expect if the signal were emerging from the server at a lower volume than it should.
I have headroom management turned off, but I did check the box that says “Show clipping indicator.” I’ve disabled all volume controls. I’m using ROCK, which I think means I’m always in Exclusive Mode. There’s nothing amiss in the signal path:
That is the value that would be applied if you have Volume Leveling enabled on a track-by-track basis, as opposed to a whole-album playthrough. The BTP figure refers to a measurement of true peak level, also written “dBTP”; I assume Roon just dropped the “d”.
I’m still a bit confused by the first number. With leveling enabled, would they really attenuate this track by more than 29 dB? The “True Peak Level” seems pretty accurate: the peak level of the samples is 0 dBFS, but the sine wave they could represent would be 3.01 dB over full scale.
Yes. Because that test track is at 0 dBFS and has 0 dB dynamic range – its peak and average levels are one and the same – it would be subject to a large negative gain adjustment to bring it in line with intended listening levels. Music signals, on the other hand, even the worst casualties of brickwall-limited loudness-wars mastering, would rarely be attenuated as much as the 0 dBFS test track.
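To make that concrete, here is a rough sketch of the arithmetic. Assumptions are flagged in the comments: Roon actually uses an R128/LUFS-style loudness measurement, while plain RMS is used here as a crude stand-in, and the -23 target is an illustrative number, not Roon’s actual setting.

```python
# Crude model of volume leveling: gain = target level - measured level.
# (RMS stands in for a real LUFS measurement; -23 is a hypothetical target.)
import numpy as np

def leveling_gain_db(x, target_db=-23.0):
    """Gain (dB) needed to move the signal's RMS level to the target level."""
    rms_db = 20 * np.log10(np.sqrt(np.mean(x ** 2)))
    return target_db - rms_db

# The 0 dBFS test tone: its RMS equals its peak, so the full distance to the
# target is applied as attenuation.
test_tone = np.tile([1.0, 1.0, -1.0, -1.0], 1000)
print(leveling_gain_db(test_tone))        # -23.0 dB: a large cut

# Typical music averages well below its peaks (say -15 dB RMS here), so its
# leveling cut is far smaller.
music_like = np.random.default_rng(0).normal(0.0, 10 ** (-15 / 20), 4000)
print(round(leveling_gain_db(music_like), 1))
```

The point is the shape of the comparison, not the exact figures: a signal whose average level sits at 0 dBFS eats the entire target offset as attenuation, which is why the leveling number for this track looks so extreme.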