MQA disappointing

But if “the truth” sounds good, what’s the problem? Some people listen to music, some people analyse it, some people campaign against MQA.

3 Likes

Some people this, some people that. Almost everybody wants to know if they're being cheated, lied to, and taken advantage of…

5 Likes

It's not a matter of whether it sounds 'good'; the argument here is that it doesn't sound exactly the same as the original masters. If the marketing claims original-master quality, then it should at least be unprocessed or left untouched. Qobuz Hi-Res files come directly from the original masters (the final stereo mixdown). That's the closest one can get, not Tidal 'Masters'.

4 Likes

Even with the anti-MQA posts I was still on board. I was like, who cares if it sounds better. But better than what? They claimed it sounded even better than the original hi-res files: removes blurring, gets truer to the source.

For me, I got that feeling when first listening to a 24-bit/192kHz Qobuz file. I think it was a Stevie Wonder album. I could hear the space around the vocals, the breath. The background was so clear. MQA sort of diffuses the sound. Vocals are smoother, but without the pinpoint placement in the center. It feels a little wider. Even the soundstage seems wider, but loses depth. I believe this is an effect of minimum-phase filtering.

I think this filter gives a pleasant sound, but with a trade-off in soundstage accuracy. In HQPlayer you can experiment with different types of filters and hear what effect each has. Minimum phase always sounded better to me at first, but then I didn't like the collapsed soundstage. Eventually I was drawn to the Chord-like filter and figured it was best to ditch HQPlayer and get the real deal. Very happy with my decision.
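The minimum- vs linear-phase trade-off above can also be seen in the impulse responses. Here's a minimal SciPy sketch; the filter length, cutoff and sample rate are arbitrary illustration values, not MQA's or HQPlayer's actual filters:

```python
# Sketch: linear-phase FIR low-pass vs its minimum-phase counterpart.
# All parameter values are arbitrary illustration choices.
import numpy as np
from scipy import signal

fs = 96_000                                # sample rate, Hz
h_lin = signal.firwin(127, 20_000, fs=fs)  # symmetric linear-phase low-pass
h_min = signal.minimum_phase(h_lin)        # minimum-phase version

# Linear phase: the impulse response peaks at the center tap, with
# pre-ringing before it. Minimum phase: energy is pushed to the start
# (no pre-ringing, but phase is no longer linear across frequency).
peak_lin = int(np.argmax(np.abs(h_lin)))   # center tap: 63
peak_min = int(np.argmax(np.abs(h_min)))   # near the start
print(peak_lin, peak_min)
```

The absence of pre-ringing is what minimum-phase fans hear as "natural" transients; the non-linear phase is one candidate explanation for the altered imaging discussed above.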

3 Likes

One other thing: I couldn't tell if it was a change of plans or just hasn't happened yet, but I think that for MQA to deliver 192k-type files, they were planning on using 96/24 files that would unfold to 192k. But then they made it seem like everything was contained in the 48/24 file, even unfolding to 384/24.

So when you hear that, you're like, wow, not only is Tidal delivering lossless, but 384/24 files. Why not get a Brooklyn DAC and jump on board? At least that's how I was thinking back then.

Again, this is unfortunate, but most people fall for it. What you see on all MQA DACs that display sampling frequencies are actually authentication flags (telling you the sample rate of the original recorded hi-res lossless master), not the actual sample rate that goes to the internal DAC. This is similar to Roon showing a detailed breakdown of the signal path. Essentially, MQA is composed of two stages: decoder and renderer. The decoder, as we know, goes up to 96kHz and no more. The renderer is basically MQA's proprietary oversampling digital filters (final de-blurring filters, if you want to call them that), which have to be matched to a known DAC chip.

There are no actual 192k or even 384k samples to begin with. The '24 bit' is really just 24 bits of data; the actual output resolution after decoding is limited to about 17 bits, at sample rates up to 96k.
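The 17-bit figure translates to dynamic range via the standard 6.02·N + 1.76 dB rule of thumb for N-bit PCM. A quick back-of-envelope (the formula is standard PCM theory; the 17-bit figure is the claim above):

```python
# Theoretical dynamic range of N-bit PCM (standard rule of thumb).
def dynamic_range_db(bits: int) -> float:
    return 6.02 * bits + 1.76

print(round(dynamic_range_db(17), 1))  # 104.1 dB: the claimed effective MQA resolution
print(round(dynamic_range_db(24), 1))  # 146.2 dB: what the 24-bit container implies
```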

3 Likes

I don't think a wider soundstage is MP filtering, or at least that's not my experience when switching filters in HQPlayer. MQA definitely gets a slight volume boost, and some believe possibly some spatial DSP as well: two effects that humans could interpret as better sound. It certainly isn't closer to the studio master (sound) in any way.

1 Like

At least with my headphones, the MP filters always seemed to bring the sound a little closer. They seemed more fun than the linear filters. The best balance of the two was the xtr filters. That's a Chord-type filter, which I believe is linear phase but still fun.

Isn't MQA de-blurring all just different MP filters? They aren't individually programmed for each song, but rather a selection from a set of 32 filters.

Do we even know if these filters are tuned for each individual DAC? That was part of the marketing: every DAC had to have its own settings. That's why full decoding couldn't just be done in software.

In the end, what turned me away was the fact that Qobuz actually sounded better, and then throw the false marketing claims on top. I think if MQA had sounded better I might have been able to get past the false claims. It's the end result that matters.

1 Like

As someone who’s worked in high end digital printing shops in NY - the places that produce limited edition way too expensive gallery prints - your statement isn’t quite correct.

While many photographers do shoot raw, there are still a large number who do not, including some of those who've made the rare jump to household names. I've seen a lot of challenge tests with people who swear they can tell a print made from a jpg (compressed) file from one made from a TIFF (uncompressed) or raw file, that they'll find all the artifacts. I've never seen one of those experts succeed, at print sizes of 30x45 inches. If a photo is properly exposed, so that it doesn't require a lot of torture in the editing process, there won't be artifacts.

There's also the "but raw files allow you to torture them in case you got something wrong" argument, which is also wrong. Raw buys you about 3/4 of a stop of forgiveness before it becomes evident to a trained eye that the file was captured poorly. It's easily visible in the histograms, but can be seen in prints too, in shadow areas and in large areas of smooth color.

Photographers capturing in jpg (lossy compression) are capturing 8-bit color. Raw files use a 16-bit color space, but typically only 12 to 14 bits are used. Then you edit on a monitor that is, if you cough up well over $10K for it, 10-bit. Oops. Most monitors are 8-bit. Double oops. How about printers? I've got a couple of printers that have a 16-bit printing mode; if you test, they're actually hitting about 10-12 bits. The vast majority of printers, those under $20,000, are 8-bit. So are a lot that are over $20K.
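The bit-depth chain above is just powers of two; a quick loop makes the gaps concrete (bit depths taken from the paragraph above):

```python
# Tonal levels per channel at each bit depth mentioned above.
levels = {bits: 2 ** bits for bits in (8, 10, 12, 14, 16)}
for bits, n in levels.items():
    print(f"{bits}-bit: {n:>6} levels per channel")
# 8-bit (jpg/most monitors): 256; 16-bit (raw container): 65536
```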

By and large, the photographers do the editing, or their assistants do, for the fine art photographs. For adverts, there are services for professional editing, but it’s always done WITH the photographer. The analogy you made just doesn’t hold. For most sports or news, they’re all shooting jpg, because of time pressures - zero time to fiddle with file conversions, and they’re pros - they capture quality data. I’ve helped print those files at massive gallery sizes, because those photographers do get shows. Raw doesn’t really matter that much.

And then (is this analogous to MQA?) in the fine art digital printing process, one of the last things done before going to print is adding digital noise to the image. Gaussian, not uniform, added at a level just below perceptible for normal viewers (an expert will be able to tell). Adding noise raises perceived sharpness, definition, and contrast, lowers perceived noise from the digital image capture process, and in general makes the image "pop" as slightly more realistic.

I’ll break that out: Adding Gaussian noise makes photographs printed at large sizes look more realistic, sharper, better definition, lower noise.
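A minimal NumPy sketch of that trick, assuming a float image with values in [0, 1]; the sigma here is a made-up illustration value, since real shops tune the level by eye per print size:

```python
# Add just-below-perceptible Gaussian noise to an image before printing.
# sigma is an illustrative value, not a real print shop's setting.
import numpy as np

rng = np.random.default_rng(0)
image = rng.uniform(0.2, 0.8, size=(512, 512, 3))  # stand-in "photo", floats in [0, 1]

sigma = 0.01                                       # in practice tuned by eye
noisy = np.clip(image + rng.normal(0.0, sigma, image.shape), 0.0, 1.0)

# Mean absolute perturbation of zero-mean Gaussian noise is sigma*sqrt(2/pi), ~0.008 here
print(float(np.abs(noisy - image).mean()))
```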

5 Likes

MP filtering “blurs” (to ironically use MQA marketing speak) instrument placement/precision, etc. My read is that a majority of folks hear this as collapsing sound/head stage, but a minority seem to hear it as expanding it. I can’t get past how MP “blurs” the timbre and resolution of everything, so I never get to how it affects sound/head stage…

Recently we were invited to a drum-miking demo at a local drum retailer.
Recording a live drummer and playing it back, we went from two mics to eight in various stages.

At the finish they talked about getting a really cheap overhead mic, the cheapest and nastiest you can buy, and then, at the final mix, feeding 'some grit' into the mix via this mic feed. The demo result was convincing, too.

1 Like

Edward_Nazarko…I agree

I am a photographer and have always shot (and still do shoot) in RAW, but it is not really necessary. Very few of my good images have been made to look better because they were RAW. I occasionally save a bad one (exposure or white balance) with the extra data in the RAW file. I do think the analogy between hi-res digital photography and hi-res audio is a good one. The sanctity of the RAW file and of the hi-res digital audio file is exaggerated. So many other factors go into producing the final sight/sound as to make these two factors bit players in the drama rather than the stars. Everything in the music chain is lossy, colored, or otherwise modified in some form or fashion. Some say tube amps introduce "pleasing distortion", or that analog is warmer and more "musical" than digital. Speakers, electronics, cables: all are lossy and have an editorializing effect on what we hear. Whether it is a "real" hi-res digital audio file or the MQA "cubic zirconia" version, I don't think it makes that much difference in either case.

They are likely programmed for different songs, because different music tracks have different 'tempo errors', and the filters are selected to optimise the transient response. This is actually done at the MQA encoder (MQA Ltd): undecoded MQA has already been treated with a pre de-blurring filter at this stage. This optimisation is for the music content only.

At the playback end, the renderer applies a post de-blurring filter that is optimised for the particular DAC chip. Therefore, different MQA DACs will have different filters implemented. The selection from the 32 filters depends on information that was already programmed in during the encoding stage (MQA Ltd). The process has to be lossless in order to stay bit-perfect for both the decoder and the renderer to work.

1 Like

Ah, so this explains why even the folded track already sounds different from the original, and, if the de-blurring uses some kind of minimum-phase filter, why it sounds worse with DACs and HQPlayer applying a linear-phase filter for the upsampling.

Totally agree! Soundstage depth and pinpoint imaging are diminished with MQA vs hi-res. MQA is a bit richer/fuller sounding.

You have to understand that all MQA files have already been pre-treated with a de-blurring filter during the encoding stage by MQA Ltd (you could say it is close to a minimum-phase slow roll-off filter, but much weaker when it comes to attenuating images). When you apply a linear filter and upsample, it actually creates more distortion because of the filter mismatch. Bear in mind that all PCM masters use exclusively linear-phase filters during A/D conversion; when you play them back, the DAC uses the same standard linear-phase filter, so there's little shift. My argument is that MQA is designed to work with its own filters for playback.

If you don't have a renderer, it is best to play the stream decoded up to 96k and feed it directly to a conventional DAC, as in the case of Roon.

1 Like

I agree with you. MQA tends to emphasise vocals and instruments so they appear to 'stand out' and sound more forward. The signature is smooth and sweet. This type of presentation is great for vocals and jazz. Classical music leans more on depth, 3D soundstage and imaging, where conventional PCM, and even more so DSD, is much better at reproduction.

1 Like

I’m not surprised. The trick of adding grain to a photographic print - just below where you’d notice without looking - takes advantage of how the human brain works. We actively seek detail, and read more detail as more precision.

The trick began back when high end professional digital cameras had 4mp sensors, and dynamic range was small. Skies or smooth surfaces got a glassy, digital look to them when printed large. Someone at Duggal Photo in NYC was fiddling and added some noise, and… that’s been passed on through professional printers since then. It’s still done even with images from 150mp sensors, because it’s such a nifty trick.

Note that similar things are done to almost every studio recording. After flawlessly recording the musicians with zero room interactions, reverb gets added at some level during mixing, because the dry recording sounds awful: cognitively awful. The recording engineers I've seen work have often added slightly different reverb to each instrument, and that makes your brain perceive more space. So all that "adding back the environment" talk about MQA: for studio albums, you're adding back the artificial environment created at the mixing board.
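That "artificial environment" is easy to sketch: algorithmic reverb is, at its simplest, convolution with a synthetic decaying-noise impulse response. A toy NumPy version, with every parameter invented for illustration:

```python
# Toy "add the room back": convolve a dry signal with a synthetic
# impulse response of exponentially decaying noise, then blend.
import numpy as np

fs = 48_000
rng = np.random.default_rng(1)

dry = np.zeros(fs)                                 # 1 s of silence...
dry[0] = 1.0                                       # ...with a single click at t=0

t = np.arange(fs // 2) / fs                        # 0.5 s impulse response
ir = rng.normal(size=t.size) * np.exp(-6.0 * t)    # decaying noise "room"

wet = np.convolve(dry, ir)[: dry.size]             # reverberant version
mix = 0.8 * dry + 0.2 * wet                        # dry/wet blend, as at a mixing desk
print(mix.shape)
```

Real engineers use far more sophisticated reverbs, per instrument, but the principle is the same: the "space" in a studio album is synthesized at mix time, not captured.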

1 Like

The other thing is that many of us thought the renderer stage does further unfolding, but in fact this is not true. The authentication flag showing the original sample frequency (OSF) on all MQA DACs is NOT the actual sample rate feeding the DAC chip (don't get tricked!). Secondly, the fact that Roon can apply DSP after decoding proves one very important point: the music content can be changed (no longer bit-perfect) before going to the renderer stage. All Roon needs to do is extract the renderer information first, apply DSP to the music content, then put the renderer info back. Since the music content has by then changed so much that it is no longer bit-perfect, there is no way to do further unfolding (extracting more information). The renderer information is meant for the renderer to select the correct MQA filter (from the 32) and do a final oversampling if necessary before feeding the DAC chip. It is more conditioning than unfolding. All unfolding is already done at the decoding stage. So in fact MQA is ONLY good up to 96k, with resolution up to 17 bits.

2 Likes

Has anyone tried to see if they can hear the difference between 192k vs 96k MQA tracks?

Looks like I still have a few days left on my Tidal subscription, so I could try this test with my ifi idsd micro doing the rendering.

I’ve done this with Qobuz hires files listening to Beck’s Sea Change album and can hear a slight bit more air and depth. Very subtle.