Nobody needs lossless over Bluetooth
Edit: plenty of downvotes from people who have never done an ABX test comparing high-quality lossy against lossless
With high-bitrate lossy you literally can't distinguish the two. There's math behind this:
https://en.wikipedia.org/wiki/Nyquist–Shannon_sampling_theorem
At 44.1 kHz/16-bit, with a good encoder at 192 kbps or above, your ear physically can't discern the difference. The sampling theorem guarantees the format already captures the whole audible band; the encoder's psychoacoustic model only discards what masking hides from you anyway.
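A quick sanity check in Python (nothing fancy, just the standard CD-audio numbers):

    # Nyquist-Shannon: sampling at rate fs perfectly captures every
    # frequency below fs / 2.
    SAMPLE_RATE_HZ = 44_100      # CD audio
    HEARING_LIMIT_HZ = 20_000    # rough upper bound of human hearing

    nyquist_limit_hz = SAMPLE_RATE_HZ / 2
    print(nyquist_limit_hz)                     # 22050.0
    print(nyquist_limit_hz > HEARING_LIMIT_HZ)  # True

    # The format covers the entire audible band with headroom to spare;
    # everything above ~20 kHz is inaudible anyway.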
Nobody “needs” to listen to music over Bluetooth at all, but why not make it sound like it’s supposed to?
Why use lossless for that when transparent lossy compression already achieves it with far less bandwidth?
Opus is indistinguishable from lossless at 192 kbps. Lossless needs roughly 800-1400 kbps. That's a 4x to 7x bandwidth savings at the exact same perceived quality.
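Rough numbers, straight from the bitrates above (the per-hour figures are just to make the ratio concrete):

    OPUS_KBPS = 192
    LOSSLESS_KBPS = (800, 1400)   # typical range for 44.1 kHz/16-bit lossless

    for kbps in LOSSLESS_KBPS:
        print(f"{kbps / OPUS_KBPS:.1f}x")   # 4.2x and 7.3x

    def megabytes_per_hour(kbps):
        return kbps * 3600 / 8 / 1000       # kilobits/s -> MB per hour

    print(megabytes_per_hour(OPUS_KBPS))    # ~86 MB/h for Opus
    print(megabytes_per_hour(1400))         # ~630 MB/h worst-case lossless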
Your wireless antenna often draws more energy in proportion to bandwidth use than the decoder chip does, so high-quality lossy even gives you better battery life. On top of that, it's more tolerant of radio noise (there's headroom for error correction and retransmits) and has better latency (less airtime per audio packet). You can even get better range from the same radio chip, since needing less bandwidth keeps the link usable at lower signal quality. A back-of-the-envelope sketch follows.
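A toy airtime estimate in Python. The 2 Mbps link rate is my assumption (roughly Bluetooth EDR territory), and real framing/protocol overhead is ignored:

    LINK_RATE_KBPS = 2_000   # assumed usable radio throughput (~Bluetooth EDR)

    def radio_busy_fraction(codec_kbps):
        """Fraction of each second the radio spends actively transmitting."""
        return codec_kbps / LINK_RATE_KBPS

    for name, kbps in [("Opus 192k", 192), ("lossless ~1000k", 1000)]:
        print(f"{name}: radio busy {radio_busy_fraction(kbps):.0%}")
    # Opus 192k: radio busy 10%
    # lossless ~1000k: radio busy 50%

Every point of idle time is time the radio can sleep (battery), retransmit lost packets (robustness), or simply stay off the air (latency and range headroom).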
You only need lossless for editing or as a master for transcoding; there's no need for it when you're just listening to media