It looks like we always consider AirPlay to be “High Quality” as opposed to “Lossless”. Thinking back, I remember why.
There are two reasons for this:
- Many popular AirPlay devices perform internal format conversions that we know to be destructive (example: all AppleTV models convert 44.1k->48k; see the sketch after this list).
- Since AirPlay streams are clocked at the source, the receiving device must either have fancy adjustable clock hardware that can slave its output to the stream coming from Roon, or it must perform DSP to resolve the clock discrepancy.
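On the first point, here’s a rough sketch (purely illustrative, not what any particular device actually runs) of why a 44.1kHz -> 48kHz conversion can’t be bit-perfect: the rate ratio reduces to 160/147, so most output samples fall between input samples and have to be interpolated, which rewrites the sample values.

```python
# Toy illustration only: 48000/44100 reduces to 160/147, so out of every
# 160 output samples only one lines up exactly with an input sample; the
# rest must be interpolated. Real devices use far better filters than this
# linear one, but any interpolation rewrites the sample values.
from fractions import Fraction

RATIO = Fraction(48000, 44100)            # = 160/147

def resample_44k1_to_48k(samples):
    out = []
    n_out = int(len(samples) * RATIO)
    for i in range(n_out):
        pos = i / RATIO                   # fractional position in the input
        lo = int(pos)
        hi = min(lo + 1, len(samples) - 1)
        frac = float(pos - lo)
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out
```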
This second issue is slightly subtle. It’s our understanding that the vast majority of AirPlay devices do not have the fancy clock chip that would be required for them to be truly bit-perfect, and instead modify the digital audio stream internally to resolve clocking discrepancies.
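As a rough sketch of what that internal modification can look like (again illustrative, not what any specific device does): the crudest fix for a receiver clock that runs slightly fast relative to the sender is to repeat a sample every so often (or drop one if it runs slow). Fancier devices resample instead, but either way the data that reaches the DAC is no longer bit-identical to what Roon sent. The 50 ppm drift figure below is just an assumed example.

```python
def compensate_fast_clock(samples, drift_ppm=50):
    """Crude drift compensation sketch: assume the receiver's DAC clock runs
    drift_ppm parts-per-million fast, so duplicate one sample roughly every
    1/drift samples to keep its buffer from running dry."""
    interval = 1_000_000 // drift_ppm      # e.g. one extra sample every 20,000
    out = []
    for i, s in enumerate(samples, start=1):
        out.append(s)
        if i % interval == 0:
            out.append(s)                  # duplicated sample: no longer bit-perfect
    return out
```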
The AirPlay protocol gives us no way to know whether or not devices are committing either of these sins, so we’ve erred on the side of not attaching a false sense of sound quality to AirPlay devices as a whole.