MQA software decoding in Roon

Anders, the complaint seems to me to be:

  1. That there is no full-decode solution in conjunction with arbitrary DSP, such as room correction. This could, of course, occur in an MQA certified DAC, but such a product does not exist.

  2. The limitations placed on a software decode.

2 Likes

If DSP functions would otherwise hinder a full software decode, they can easily be bypassed, so a full decode can still happen in Roon.

What kind of limitations are placed on a software decode?

To quote Bob:

“Full software decode is not possible because the DAC must be known and characterized. MQA is an analog to analog process.”

My understanding is that Roon is different in that it recognizes (and certifies) the DAC. It would allow Roon to do a software decode specifically to the end point DAC, as long as that DAC is certified. The DAC would not need the hardware to do the decode itself to still have the full analogue to analogue certified chain.
Using the RAAT setup, I would guess Roon provides a sufficiently ‘closed’ system.

Time will tell. I hope Roon takes all the time it needs to maximize our experience :).

1 Like

I’m sure they’ll do their very best, but remember they’re still a small company and have a lot of “urgent” “top priorities” we also demand fixes for :wink:

Which is why I suggest they take all the time they need.

1 Like

Not so easy if you run a Devialet amp…

… or if you have Avantgarde Acoustic Zero 1 speakers with an internal DAC.

1 Like

If you add DSP after the first, software unfold (which requires lossless, bit-perfect input and gets you to an 88.2 or 96 kHz sampling rate) but before handoff to the DAC for the supposed processing to higher rates (e.g., 176.4, 192, 352.8, 384 kHz), only the content up to 48 kHz would be processed, and the original content would be completely gone by the time it reaches the DAC (it has been upsampled, filtered, etc. - NOT a lossless process). All that is left is the MQA “signature”; i.e., some code that says, “this was originally an MQA stream”, to turn on the “authenticated” light.

Questions that pop into my head:

  1. How does the content below the noise floor that is required for the final MQA DAC processing survive processing (e.g., DSP, upsampling, etc.)?
  2. If it somehow survives, does the DAC just “tack on” the above-48 kHz content, since it has not been DSPed? It can’t: the software part only extracts up to 88.2 / 96. So you would have DSPed content up to 48 kHz and untouched content above 48 kHz.

It makes a lot more sense to me if the original, software unfold actually contains “music” up to 48 kHz, which can be upsampled / filtered, etc., and the DAC processing is literally only additional upsampling / filtering of the results of the DSP, not adding any additional musical content above 48 kHz.
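If that reading is right, the rate chain can be written down directly. This little helper is my own sketch of the scheme described above, not MQA’s actual algorithm; the function name and structure are mine:

```python
def unfold_rates(base_rate):
    """Sketch of the MQA rate chain discussed above (illustrative only).

    base_rate is the delivered container rate: 44100 or 48000 Hz.
    """
    core = base_rate * 2                       # after the first, software unfold
    rendered = [base_rate * 4, base_rate * 8]  # DAC-side rendering targets
    content_ceiling = base_rate                # musical content tops out at Nyquist of core
    return core, rendered, content_ceiling

# For a 48 kHz container: core decode at 96 kHz, rendering at 192/384 kHz,
# with actual musical content only up to 48 kHz.
print(unfold_rates(48000))
```

On this picture, everything the DAC does above the core rate is filtering/upsampling of what DSP already touched, which is exactly the point made above.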

Just my thinking and, as you say, I may be “incorrect”.

MQA does incorporate at least one type of in band coded signature/watermark. This has been documented. Some are concerned that it could be used for draconian DRM purposes. At the very least, though, it appears to be used for MQA identification. And that signature/watermark seems to be quite robust if it can pass through intermediate DSP so that an MQA renderer still can identify MQA audio after the first unfold and intermediate DSP.

This is just my supposition, but I would bet on some sort of spread spectrum scheme. As long as the signature/watermark were very low bit rate, spreading it across the 2.3 Mbps of typical 24 bit 48 kHz two channel MQA should make it inaudible and resilient against many forms of intermediate DSP. The signature/watermark would not be sufficient bandwidth to carry actual audio content, but it could carry signaling information that would allow for informed upsampling. The MQA literature screenshot posted in this thread earlier today suggests as much with its parenthetical reference to “containing buried information on how to proceed.” That sounds like a set of instructions for intelligent processing/synthesis, not additional latent audio content waiting to be decoded.
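To make the supposition concrete, here is a minimal direct-sequence spread-spectrum sketch in pure Python. Everything here is hypothetical (the chip length, amplitude, and function names are mine, and this is in no way MQA’s actual scheme); it just shows how a low-bit-rate payload can ride far below the signal level and still be recovered by correlation:

```python
import random

def pn_chips(n, seed=1234):
    # pseudo-noise spreading sequence of +/-1 chips (purely illustrative)
    rng = random.Random(seed)
    return [rng.choice((-1.0, 1.0)) for _ in range(n)]

def embed(host, bits, chips, amplitude=0.01):
    # add one chip period per payload bit, well below the host signal level
    out = list(host)
    n = len(chips)
    for i, bit in enumerate(bits):
        sign = 1.0 if bit else -1.0
        for j in range(n):
            out[i * n + j] += amplitude * sign * chips[j]
    return out

def extract(signal, nbits, chips):
    # despread: correlate each chip period against the known sequence
    n = len(chips)
    return [sum(signal[i * n + j] * chips[j] for j in range(n)) > 0.0
            for i in range(nbits)]

# noise-like stand-in for audio content, ~20 dB above the watermark amplitude
rng = random.Random(0)
payload = [True, False, True, True, False, False, True, False]
chips = pn_chips(4096)
host = [rng.gauss(0.0, 0.1) for _ in range(len(payload) * len(chips))]
marked = embed(host, payload, chips)
recovered = extract(marked, len(payload), chips)  # matches payload
```

The long chip sequence is what buys robustness: the correlation gain grows with chip length, which is why such a signature could plausibly survive intermediate DSP while carrying only signaling information, not audio.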

AJ

1 Like

The bit rate of MQA is actually capped at around ~1.5 Mbps, not the 2.3 Mbps of conventional uncompressed 24/48 kHz PCM (24 bits × 48 kHz × 2 channels ≈ 2.3 Mbps). I remember posting this graph some time ago but was not able to get the answer I’m looking for. Maybe you can help me.

Looking at the graph, one can tell immediately that the MQA bit rate is capped across sampling frequencies, whereas the conventional PCM bit rate increases proportionally with sampling frequency.

Now my question is: why is the bit rate for MQA capped constant across sampling frequencies rather than increasing? Obviously, if the sampling frequency increases, there is more information to carry, so the bit rate has to increase!

This brings me to one conclusion: anyone who looks at the graph will straight away tell you there is ‘compression’ going on, and a lot of information is being ‘thrown away’ because there is not enough bit-rate bandwidth to carry it in the first place, especially at the higher sampling rates.
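For reference, the linear PCM line in the graph is simple arithmetic. A minimal sketch (the function name is mine):

```python
def pcm_bitrate_bps(bit_depth, sample_rate_hz, channels=2):
    # uncompressed PCM: bits/sample x samples/second x channels
    return bit_depth * sample_rate_hz * channels

rate_48k = pcm_bitrate_bps(24, 48_000)    # 2_304_000 bps, ~2.3 Mbps
rate_192k = pcm_bitrate_bps(24, 192_000)  # 9_216_000 bps, ~9.2 Mbps
```

So native 24/192 needs four times the bit rate of 24/48, while the MQA line in the graph stays flat at roughly 1.5 Mbps.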

If anyone here can indeed offer me a good explanation, I will be very happy. Case closed!

MQA content is delivered in a 24/48 or 24/44.1 FLAC container. Regardless of the level of unfolding, the amount of data delivered is still the same FLAC file (which, assuming reasonable compression, would have a data rate of about 1.5 Mb/s).

When the flac file is uncompressed then the data rate is commensurate with the 24/48 or 24/44.1 container.

The graph you are referencing has nothing to do with the data rate after unfolding. Its intention is to show the efficiency of delivering high res content using MQA packaging vs just shipping the native high res data. In other words, MQA is good for streaming because it’s efficient.

1 Like

@AMP I still don’t get it. If the bit rate to start with is low at high sampling rates, where is the additional information it needs to fold back to exactly match conventional PCM?

Is there some ‘magic’ buried inside the noise floor that can reconstruct all the information to match conventional PCM? You need sufficient information to reconstruct it, and in this case there isn’t. It is like ‘magic’ to me.

No, it’s just a very sophisticated compression algorithm. Not truly lossless, but sacrificing the part of the bitstream that simply cannot be resolved by any typical DAC.

If the system doesn’t behave in a ‘truly lossless’ manner, then it makes sense to me. I’m not trying to fault anyone or the system; I just need to understand what is going on here.

It’s a bit of lossless compression applied so the bitrate is lower when transferring the encoded MQA file. MQA is not digitally lossless (meaning you won’t get perfect reconstruction of source bits)…but that’s why they emphasize it’s an analog-to-analog process. The compromise is losing a bit of digital data for the sake of getting a better analog reproduction.

3 Likes

MQA is wrapped in a FLAC container, and the FLAC container is lossless. The degree of lossless compression FLAC (the container) achieves is very limited, since MQA itself is already so compressed; there’s not much more FLAC can do losslessly. I see this as an easy way to deliver MQA to the masses.
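One way to see why FLAC gains little here: lossless compressors only shrink predictable data, and a packed payload in the lower bits looks like noise. A rough demonstration using zlib as a stand-in for FLAC’s entropy coding (the byte counts are illustrative, not FLAC’s actual figures):

```python
import random
import zlib

rng = random.Random(0)
silent_ish = bytes(3000)                                      # highly predictable samples
noise_like = bytes(rng.getrandbits(8) for _ in range(3000))   # noise-like packed payload

small = len(zlib.compress(silent_ish))   # collapses to a few dozen bytes
large = len(zlib.compress(noise_like))   # stays at roughly the input size
```

Predictable audio compresses dramatically; noise-like data barely compresses at all, which is why an already-dense MQA stream leaves FLAC little to work with.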

I like @rovinggecko’s take on all of this and @AndersVinberg is doing a fine job of reminding us of relevant references in the published material.

It seems technically feasible for Roon to perform DSP between the first and the second/third unfolding. The exact nature of the second/third unfolding is sort of beside the point. It will remain whatever it is. What everyone (I think) would like, in the best of all possible worlds, is the ability to convolve for Room EQ in Roon while enjoying full software decoding of MQA.

If full software decoding in Roon is not possible, which would seem to be a commercial decision by MQA rather than a technical one, then some capacity to do DSP after the first unfolding would seem desirable. That may mean that MQA lights do not go on, so be it.

There are doubtless reasons why the devs cannot tell us more. I suspect that the negotiations are sensitive and ongoing. They certainly seem to be the highest priority for Roon.

If it turns out that users must choose between MQA and DSP then I think that would be a bad outcome for MQA.

2 Likes

It has been stated that DRM has no place in MQA.

But that process still has some relevance in identifying and authenticating a ‘real’ MQA file rather than something created on a laptop in a bedroom somewhere. So while it is not DRM in that sense, it still needs to be there for authentication. That is not a bad thing.