MQA first unfold in Roon? [Delivered in 1.5]

As I understand MQA, that analogy is not relevant. Photographic grain is random and therefore cannot be corrected, only filtered, as you suggest. But in photography there are other errors that are completely deterministic, described by a transfer function, and these can be corrected by applying the inverse transfer function. Adobe Lightroom includes such corrections for the lens, fixing geometric distortion and chromatic aberration. The lens sits at the start of the process, like the ADC, and its errors, already in the data stream, need to be removed. (Some cameras include this software processing these days.) Lightroom also includes corrections for the limited transfer function of the printer, such as its limited gamut; the printer sits at the end of the process, like the DAC, and can be pre-compensated by using profiles for printers, inks and papers in combination.

These are well established and non-controversial techniques in commercial software, and fit in the workflow in a manner similar to MQA.

3 Likes

No, my film-grain analogy is apropos. To keep this response short: shifting pre-ringing to post-ringing, and applying a slow-rolloff filter for time-domain performance while leaking alias products, for example, are not inverse-transfer-function error corrections. Those are trade-offs.

AJ
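The pre-ringing/post-ringing trade-off mentioned above can be sketched numerically. This is my own illustration using generic SciPy filters, not MQA's actual coefficients: a linear-phase FIR lowpass spreads its ringing symmetrically around the peak, while the minimum-phase version of the same filter pushes that energy after the peak, at the cost of a nonlinear phase response.

```python
import numpy as np
from scipy import signal

# Generic linear-phase lowpass (illustrative, not an MQA filter):
# the impulse response is symmetric, so roughly half the ringing
# energy arrives *before* the main tap (pre-ringing).
h_lin = signal.firwin(127, 0.4)

# Minimum-phase counterpart: approximately the same magnitude
# response, but the ringing is shifted after the peak (post-ringing).
h_min = signal.minimum_phase(h_lin, method='homomorphic')

def pre_ring_fraction(h):
    """Fraction of impulse-response energy arriving before the peak."""
    peak = np.argmax(np.abs(h))
    return np.sum(h[:peak] ** 2) / np.sum(h ** 2)

print(f"linear phase:  {pre_ring_fraction(h_lin):.3f}")
print(f"minimum phase: {pre_ring_fraction(h_min):.3f}")
```

Neither filter "corrects" anything here; the energy is only redistributed in time, which is why this is a trade-off rather than an inversion.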

So you reduce MQA down to the application of apodising filters?

If so, that was answered very clearly in several Q&A.

1 Like

No, I used as illustrations several trade-offs that MQA makes. They are well documented.

AJ

Thanks for clarifying that.

I prefer the lens and printer analogies as more akin to what MQA is actually doing.

You can “prefer” to think that way – but believing that MQA processing is the equivalent of inverse-transform “corrections” is not accurate. Rather, it seems to suit the MQA mystique that some people really want to believe in.

AJ

1 Like

What do any of those measurements have to do with MQA?

Please don’t lecture me on what I believe. Apply the notion of preference to your own thoughts and comments too, if you are just referencing other people’s work; that work requires the reader to “believe” as well.

All MQA devices use a standardized digital filter. The measurements show the time- and frequency-domain characteristics and trade-offs of said digital filter. Highly relevant.

AJ

Impressive. You must have measured ALL MQA devices then.

Also highly relevant, I think.

1 Like

This statement is absolutely wrong.

1 Like

I am glad that you backed up your assertion with evidence. But, anyway…

Do you think it coincidence that the Mytek Brooklyn employs an MQA digital filter identical to that in the Meridian Explorer2?

Footnote 2: “This filter is identical to that used in Meridian’s MQA-capable Explorer2. See fig.4.”

AJ

For those who are interested in Stereophile measurements, there are also:

Filter selection still applies. My point is that you should not have to “interpret” the file before you call the MQA lib; the lib should do that for you.

@joel: Can you point us to evidence otherwise? Looking at the DF Red, for example, the ESS DAC doesn’t really know anything about MQA; its “standard” filters are selected by the controller (the only part the firmware upgrade affected) in response to the MQA stream. I think selecting these filters via the MQA stream is an interesting feature, but there’s no special MQA filter used. This has been studied in many places, and in fact some experts’ opinion is that the filter choices are not particularly good; look for mansr’s posts on CA.

I don’t think anyone outside the company really knows this–and I’ve read, with comprehension, most of the MQA critiques.

I can’t speak for @Joel, but it’s reasonable to assume that he can’t fully speak for himself either. (I’m writing this without consulting him.) It’s reasonable to assume, first, that he has first-hand exposure to MQA’s inner workings and, second, that he may not be allowed to reveal what he knows. We should thank him for the clue.

I think you’re making the argument perpetuated by Brian Lucey. I agree it has merit–especially from the perspective of a skilled and principled recording engineer. The best engineers are likely to be familiar with the limitations of digital technology and to have accounted for them in their work, producing the result they wanted–not some unfortunate compromise to bad technology. But while the argument has merit from, as I said, an engineer’s point of view, it seems to me a bit slick from the perspective of the high-end audio industry. Many CD players have been sold over the last couple of decades on the notion that they fix digital errors, on one end or the other of the recording chain–many by companies whose principals and reps now oppose MQA. Indeed, “fixing” timing errors has, I’d venture, been the dominant theme of the last couple of decades in digital audio–more if you include jitter.

This is indeed a sort of philosophical problem with MQA, IMO: that 2014 paper by Stuart and Craven lays out a worldview in which there’s a clear distinction between archival formats and distribution formats. While they never mention MQA, they imply that the technology they’re describing is for distribution, not for archiving. But if I recall correctly, the article doesn’t address “mastering” in the broad sense, and it seems to me that MQA is doing a bit of that – of remastering. It’s reasonable that the engineers who produced the original masters might not want that. It’s a clash of digital-audio worldviews.

Pretty much @Jim_Austin.

2 Likes

The first unfold is surely proprietary and unique. The rendering, not so much; at least not in the DragonFly device, and I very much doubt it is anywhere else. The choice of filter applied after the first unfold is surely creative. But the question (as I understood it, anyway) was whether those rendering filters were MQA-specific, and the answer to that in the DF case is no.