On a physical level, the MQA file is being decoded to 88.2/96kHz. This is a decoding/decompression process, not an upsampling process, so just like the HDTracks recording, the decoded output looks like an 88.2kHz/96kHz PCM stream.
If you have an MQA DAC, then there is a little bit of extra information buried in the noise in that stream that’s used to drive the rendering capabilities of the DAC. If not, then Roon instructs the MQA decoder to omit that information, since there’s no use in including those bits if the DAC isn’t going to understand them. A regular non-MQA FLAC file will never contain the rendering information needed to drive an MQA DAC.
As for what may make MQA recordings different from what you can buy on HDTracks–
My understanding of the MQA encoding process is not as thorough as my understanding of the playback chain, but as I understand it, MQA encoding includes techniques that correct for details of the recording/mastering process, and these techniques are not available for non-MQA masterings.
Some of these techniques are straightforward, good ideas–compensating for undesirable characteristics of particular analog tape machines or early A/D converters, for example. When techniques like this are used, the result may be a master recording of higher quality than what was available to HDTracks.
MQA also has the ability to authenticate “MQA Studio” vs “Valid MQA”, which can provide a hint about the provenance of an individual recording.
No. De-blurring takes place both in the encoding (correcting on the recording side of the chain) and during rendering (correction for the DAC).
The MQA Core Decoder integrated into Roon does a few things:
Authenticates the provenance of the original stream. This uses cryptographic techniques to ensure that the bits weren’t disturbed between the mastering environment and Roon.
Decompresses the MQA data to 88.2/96kHz.
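MQA's actual authentication scheme is proprietary and not public, so as a generic illustration only: stream authentication of this kind is typically built on a cryptographic tag computed in the mastering environment and verified at playback. The key name, function names, and tag construction below are all assumptions for the sketch, not MQA's real design.

```python
import hashlib
import hmac

def sign_stream(audio_bytes: bytes, studio_key: bytes) -> bytes:
    """Produce an authentication tag over the mastered audio.
    Illustrative only -- MQA's real scheme is proprietary."""
    return hmac.new(studio_key, audio_bytes, hashlib.sha256).digest()

def verify_stream(audio_bytes: bytes, tag: bytes, studio_key: bytes) -> bool:
    """Recompute the tag and compare in constant time. Any change to
    the bits between mastering and playback makes this check fail."""
    expected = hmac.new(studio_key, audio_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

key = b"studio-secret"        # hypothetical mastering-side key
master = bytes(range(64))     # stand-in for the encoded stream
tag = sign_stream(master, key)

assert verify_stream(master, tag, key)                 # untouched stream passes
assert not verify_stream(master + b"\x00", tag, key)   # disturbed stream fails
```

The point of the sketch is just the property Brian describes: if even one bit is disturbed between the mastering environment and the decoder, verification fails and the stream no longer authenticates.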
Roon also does some other stuff:
Manages information about DAC capabilities in order to design the most appropriate signal path for all situations
Acts to preserve/restore rendering instructions when performing DSP, so that we don’t interfere with DACs which can perform MQA rendering.
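The preserve/restore step in the second item can be sketched in miniature. Assuming (as described elsewhere in this thread) that the signaling rides in the least-significant bits of the decoded samples, a DSP stage can lift those bits out, process the audio, and put them back afterward. The functions and the "LSB only" layout here are illustrative assumptions, not Roon's actual implementation.

```python
def extract_signaling(samples):
    """Split each 24-bit sample into audio (upper bits) and the
    LSB assumed to carry signaling data (illustrative layout)."""
    signaling = [s & 1 for s in samples]
    audio = [s & ~1 for s in samples]
    return audio, signaling

def restore_signaling(samples, signaling):
    """Re-embed the saved signaling bits after DSP has run."""
    return [(s & ~1) | bit for s, bit in zip(samples, signaling)]

def dsp(samples):
    """Stand-in for any DSP step (EQ, volume leveling, etc.)."""
    return [s + 2 for s in samples]  # toy transformation

stream = [0x000101, 0x000200, 0x000301]
audio, sig = extract_signaling(stream)
processed = dsp(audio)
out = restore_signaling(processed, sig)

# The signaling bits survive the DSP step intact:
assert [s & 1 for s in out] == sig
```

This is why DSP and MQA rendering aren't mutually exclusive: the audio content changes, but the instructions the renderer needs are carried around the DSP stage and re-embedded on the way out.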
The “thingy” you mention is not happening in Roon. That is what is called “Rendering”, and that is what MQA DACs do.
Roon doesn’t perform rendering, but it does contain code which preserves/restores the instructions required to drive the renderer. We call these instructions “MQA Signaling” in Signal Path:
So while Roon is not doing anything with that information, we are respecting + preserving it to let the DAC do its part at the end.
Maybe. The stuff that we do around DSP to preserve/restore signaling information is news to many people–the “old” mental model treated MQA and DSP like oil + water, and assumed that they didn’t mix. So if you are used to thinking about it that way, it may be time to adjust.
My understanding is that this is not correct. The MQA encoding process can/does remove bits that the MQA algorithm thinks are “unnecessary”, and those aren’t (can’t be) restored. This is definitely the case if the original file was 176k or 192k or more, as the MQA process doesn’t encode any content above 48kHz even when it is present in the original. MQA says their process is “perceptually lossless”. That isn’t the same as “lossless”. Whether it matters, whether it changes the sound of the result, is a different discussion.
Read that statement over again–I was only describing the decode process in those terms–i.e. that when we decode a 24/48 MQA file to a 24/96 MQA Core stream, what is happening is decompression, not upsampling.
My understanding of the encoding process is similar to yours–MQA encoding is not data-lossless. If it were, it would be possible to recover the original bits of the master from the MQA file…and that is clearly not what they were going for here.
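The distinction between "data-lossless" and "perceptually lossless" has a precise test: a codec is data-lossless if and only if decoding recovers the exact original bits. A minimal sketch of that test, using `zlib` as a stand-in generic lossless compressor (not FLAC or MQA, just an illustration):

```python
import zlib

def is_data_lossless(original: bytes, encode, decode) -> bool:
    """A codec is data-lossless iff decode(encode(x)) == x exactly.
    'Perceptually lossless' makes no such bit-level promise."""
    return decode(encode(original)) == original

master = bytes(range(256)) * 4  # stand-in for the original master bits

# A generic lossless compressor round-trips the bits exactly:
assert is_data_lossless(master, zlib.compress, zlib.decompress)

# A coder that discards the LSB of every byte does not:
lossy_encode = lambda b: bytes(x & ~1 for x in b)
assert not is_data_lossless(master, lossy_encode, lambda b: b)
```

As Brian notes, if MQA passed the first test you could recover the original master bit-for-bit from the MQA file, and that is not what the format is designed to do.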
Am I right to say that if the ‘rendering’ instructions are buried together with the first unfold, doing any form of DSP will destroy the rendering instructions passed to the renderer?
Now, with this in mind, I’m trying to understand how the renderer works. If the unfolded stream is modified by DSP, it is no longer bit-perfect; that stream then goes to the renderer, together with a separate path carrying the ‘rendering’ instructions. What I can’t figure out is: if the unfolded stream has been modified by DSP, can the renderer still extract useful information from those instructions? This is the area I guess many of us still don’t really understand.
I think all of the building blocks are out there–they just need to be put together in the mental model.
Rendering instructions can be embedded inside of a 88.2/96kHz PCM stream–this must be true because it’s possible to drive a renderer-only device like a Dragonfly Black with an unfolded 24/96 stream and see it light up purple to indicate that it is rendering.
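It may help to see why data buried in the bottom bits of a 24-bit stream is effectively "in the noise", as described earlier in the thread. Assuming the signaling occupies the least-significant bit (an assumption for illustration), its level relative to full scale is:

```python
import math

# Level of the least-significant bit in a 24-bit PCM stream,
# relative to full scale (a 24-bit sample spans 2^23 magnitudes):
lsb_level_db = 20 * math.log10(1 / 2**23)

print(round(lsb_level_db, 1))  # -138.5 (dBFS)
```

At roughly -138 dBFS, the signaling sits below the noise floor of any real-world playback chain, which is why a non-MQA DAC can play the unfolded stream as ordinary 24/96 PCM while a renderer-only device like the Dragonfly can still find and act on those bits.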
The way Roon works is actually exactly how we lay it out in Signal Path, except that we refer to the rendering instructions as “MQA Signaling” in the user interface:
Now I understand Roon uses an ‘extract’ and ‘re-embed’ process to preserve the rendering instructions. Are you guys able to explain in detail how the renderer works? How do the rendering instructions tell the renderer to apply the ‘rendering’ effects?
If the MQA encoding was able to remove some of the artefacts (see Brian’s note above) caused by the recording/encoding chain, then it is quite likely that the MQA result will not be the same even when taken from the same source. But that is the point–it is exactly what MQA is trying to do.
Have a read of the white glove treatment of the Toneff album on the MQA site. It is quite enlightening w.r.t. what one might need to do to remove the effect of the equipment used at the time (e.g. 1980s digital tape recorders with primitive filters).
Many master tapes have good records of what the recording chain was, and many of those pieces of equipment can be (and have been) profiled, so MQA knows what effect they had on the signal.
Think of lens correction filters as an analogy. Your photo is the digital HDTracks master. But the image may not show the subject exactly as it appeared–it carries the effects of the lens you used to take the picture. The lens correction filter helps you get your picture looking more like the real thing you thought you snapped.