Streamed High Res FLAC vs. AIFF?

Expectation / confirmation bias, but hey, whatever floats your boat.

The eye-opener is the power of expectation bias. Roon sends exactly the same data to your M Scaler, and it is buffered. The extremely small amount of time it takes to decompress the FLAC file is not an issue, and the extra disk access required by the AIFF file is just as “bad”…

“Expectation bias” can equally go the other way too. You may be expecting them to sound exactly the same, so they do to you.

Did you take the time to read the article you recommended? It actually explains why IN YOUR CASE (i.e. FLAC vs. AIFF) a blind test might be quite useful…

If one ear is 1" closer to your speakers than the other, that’s probably 100 µs of latency right there. Sound travels through air at approximately 1 foot per millisecond, so two rooms could easily differ by 1 ms of latency and it would be undetectable. Lip-sync error can easily be 1,000x greater than 20 µs; in fact, most people can’t detect lip-sync errors of 50-100 ms, although sound preceding lip movement is much more detectable than sound following lip movement. The International Telecommunication Union did lip-sync testing and their paper is here.
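The arithmetic here is easy to sanity-check. A quick sketch, assuming the speed of sound is about 343 m/s at room temperature (the exact figure varies slightly with temperature and humidity):

```python
# Speed of sound in air at roughly 20 °C, in metres per second (assumed value).
SPEED_OF_SOUND = 343.0

def delay_us(distance_m: float) -> float:
    """Time for sound to travel the given distance, in microseconds."""
    return distance_m / SPEED_OF_SOUND * 1e6

INCH = 0.0254   # metres
FOOT = 0.3048   # metres

print(f"1 inch -> {delay_us(INCH):.0f} us")   # about 74 us, roughly the "100 us" above
print(f"1 foot -> {delay_us(FOOT):.0f} us")   # about 889 us, i.e. ~1 foot per millisecond
```

So a one-inch head offset really is on the order of 100 µs, and the 1 ft/ms rule of thumb holds to within about 10%.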

True. If there is a constant difference between L and R of, say, 100 µs or even more, it would not matter much. However, if the latency fluctuated during playback, it would be easily detectable.

I adjust the speaker-to-listening-position distances of a stereo pair to match to within zero mm, primarily so I can place a measurement microphone for room-correction measurements, but also for proper focus in the soundstage.

Incorrect. Roon prebuffers ~5 seconds of audio to the playback endpoint. This is raw, uncompressed audio. FLAC (or any other audio file) decoding has zero impact on delivery of the audio to the hardware; it’s completely disconnected.
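That decoupling can be illustrated with a toy producer/consumer model. This is a sketch, not Roon’s actual buffering code, and the numbers (5-chunk prebuffer, jittery decode rate) are made up for illustration: playback pulls chunks at a fixed rate, and as long as the decoder, however jittery, keeps the buffer from emptying, decode time never shows up at the output.

```python
import random
from collections import deque

BUFFER_TARGET = 5   # chunks prebuffered before playback starts (illustrative)
TICKS = 1000        # playback "ticks"; one chunk is consumed per tick

random.seed(0)
buffer = deque()
underruns = 0
credit = 0.0        # fractional decode work accumulated across ticks

# Prebuffer before playback starts, as the post describes Roon doing.
for _ in range(BUFFER_TARGET):
    buffer.append("pcm-chunk")

for _ in range(TICKS):
    # The decoder produces chunks at a jittery rate, averaging 1.1 per tick:
    # sometimes slower than playback, sometimes faster.
    credit += random.uniform(0.7, 1.5)
    while credit >= 1.0:
        buffer.append("pcm-chunk")
        credit -= 1.0
    # Playback consumes exactly one chunk per tick, regardless of decode jitter.
    if buffer:
        buffer.popleft()
    else:
        underruns += 1

print("underruns:", underruns)   # 0: decode-time jitter never reaches the output
```

The output timing is fixed by the consumer; decode-time variation only changes the buffer depth, which is exactly why a few milliseconds spent decompressing FLAC is inaudible.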

With networked endpoints (RAAT), they see the same thing (PCM). I’ve tested it with FLAC and WAV. They sound identical.
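The “same thing (PCM)” claim is the crux, and it is verifiable in principle: lossless compression round-trips to bit-identical data. A tiny illustration, using zlib as a stand-in for FLAC since decoding real FLAC would need an external library (both are lossless, which is the only property that matters here):

```python
import hashlib
import os
import zlib

# Pretend this is a stretch of raw PCM samples.
pcm = os.urandom(1 << 16)

# "FLAC-like" path: compress for storage, decompress at playback.
# A lossless codec must return the exact original bytes.
roundtrip = zlib.decompress(zlib.compress(pcm))

print(hashlib.sha256(pcm).hexdigest() == hashlib.sha256(roundtrip).hexdigest())
# -> True: the endpoint receives identical samples either way.
```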

With devices where the decoding happens on the device itself, there could perhaps be differences between FLAC and WAV. CPU load might be a tiny bit higher for FLAC, but there is also higher I/O with WAV. Cumulatively, which is worse?
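The CPU-versus-I/O trade-off in that question can be made concrete. A sketch using zlib as a stand-in for FLAC and a synthetic tone as a stand-in for music (the compression ratio here is illustrative, not a measurement of real FLAC):

```python
import math
import zlib

# Stand-in for real audio: one second of an 8-bit 440 Hz tone at 44.1 kHz.
# Like music, it is highly compressible by a lossless codec.
pcm = bytes(
    int(127 + 100 * math.sin(2 * math.pi * 440 * n / 44100))
    for n in range(44100)
)

compressed = zlib.compress(pcm, level=6)

print(f"uncompressed: {len(pcm)} bytes to read from disk")
print(f"compressed:   {len(compressed)} bytes to read from disk")
# Fewer bytes come off the disk with the compressed file, but some CPU is
# spent on decompression at playback time. Which side costs more depends
# entirely on the hardware doing the decoding.
```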

For example, with the BDP-1 running as a Roon endpoint, FLAC and WAV sound identical. With the BDP-1’s native MPD player, there is a very slight difference between FLAC and WAV. However, I’ve flip-flopped as to which sounds better: WAV perhaps comes across as more dynamic and open, but FLAC sounds more controlled and smooth. I use FLAC because of the metadata support and economical storage. (There’s also the issue that the BDP-1 sounds better running MPD with FLAC or WAV than as a Roon endpoint. More things happening inside the BDP-1 with RAAT than with MPD?)

Well, can you explain why a buffered stream of PCM would sound different when the source is compressed versus uncompressed? For argument’s sake, let’s assume the extra CPU time for FLAC makes a difference. How would the extra disk I/O for AIFF not make a difference as well? Which is worse? I would expect I/O to be, as it involves more circuitry.

You are just stating a subjective opinion that is colored by expectation. I have actually listened to both file types in blind tests, very much looking for the best sound. I have looked for technical explanations for any difference and have found no compelling reason for a sound difference.

Persuade us as to why there should be a difference and not that you think you hear one. Because I can guarantee you that you would fail a blind test.

You’re making an assumption that file I/O is more detrimental to the audio than the CPU processing for the real-time decompression of a compressed file. I doubt this is the case. As has already been mentioned in this thread, pro recording and mastering studios never use compressed audio files, due to the processing delays and CPU overhead incurred when working with them, and maybe even due to quality-loss issues.

Both types of files require file I/O. A compressed file must first be decompressed to be used, which is an extra step and more overhead.

Persuade us as to why there should be a difference and not that you think you hear one.

I’m not here to persuade anyone of anything. I already know which type of file I prefer and recognize that others may prefer something else. I am, however, open to hearing others’ experiences and findings on the subject, in the spirit of inquiry, versus preaching.

That is subject to this:

The point is, if your DAC is connected directly to the core, then theoretically we have to allow for there to be some sort of difference based on the added load on the CPU. But that’s not a timing-error issue; it is more likely electrical/noise.

But either way, by the time the audio signal transits the network to a separate RAAT endpoint, it is just a PCM stream that doesn’t care how it got there or how hard a CPU had to work to decode and push it there. All that is relevant is how electrically noisy the endpoint is, and that has nothing to do with FLAC/WAV/AIFF.

Not in my opinion. One of the first things I did when I got Roon was to queue up AIFF, WAV, FLAC, and ALAC versions of the same song, a song I chose because it was very revealing and I was very familiar with it.

I could hear no difference.

My endpoint is separate from my core and music is delivered to it via Wi-Fi from my router. My core is connected to my router via Cat5 UTP.

Two lossless (non-destructive) formats are equivalent and deliver the same binary content to the DAC input, so the sound will be the same at the analogue output of the same DAC.