For context, LDAC is one of the few wireless audio codecs stamped Hi-Res by the Japan Audio Society, and its encoder has been open source since Android 8, so you can see just how long Windows has been sleeping on this. I’m excited about the incoming next gen called LC3plus; my next pair is definitely gonna have that.

  • drwankingstein@lemmy.dbzer0.com · 1 year ago

    LDAC is not actually that good; it’s fairly rare that LDAC beats out something like SBC XQ, let alone AAC.

    EDIT: for elaboration, LDAC works at 3 main data rate ranges: 990/909, 660/606 and 330/303 kbps. LDAC is only hi-res at the 990 range, and even at that range it still often loses when PipeWire is compiled against libfdk. Keep in mind that it’s hard to get real numbers on LDAC because decoding is proprietary, meaning I had to disassemble headphones and connect to those for verification, but typically AAC on supported headphones beat out 990kbps LDAC (which is hilarious, btw, considering LDAC can rarely sustain 990kbps anyway), and both SBC-XQ and LC3plus (both of which are usable with PipeWire) regularly beat 660kbps LDAC.
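For anyone keeping track of the tiers mentioned above, here’s a minimal sketch (Python, purely illustrative; the helper function is mine, not anything from an LDAC SDK) of how the nominal bitrates line up: 48/96 kHz streams use the 990/660/330 kbps rates, while 44.1/88.2 kHz streams use the 909/606/303 kbps variants.

```python
# Illustrative mapping of LDAC's three quality tiers, as described above.
# The bitrate depends on which sample-rate family the stream belongs to.
LDAC_BITRATES_KBPS = {
    "48k_family": {"high": 990, "mid": 660, "low": 330},    # 48/96 kHz
    "44.1k_family": {"high": 909, "mid": 606, "low": 303},  # 44.1/88.2 kHz
}

def ldac_bitrate(sample_rate_hz: int, quality: str) -> int:
    """Return the nominal LDAC bitrate (kbps) for a stream (illustrative)."""
    family = "44.1k_family" if sample_rate_hz % 44100 == 0 else "48k_family"
    return LDAC_BITRATES_KBPS[family][quality]

print(ldac_bitrate(96000, "high"))  # 990
print(ldac_bitrate(44100, "mid"))   # 606
```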

    TLDR: LDAC is crap; SBC-XQ is typically more accurate and lower latency, and LC3plus is even better than that. And if you have AAC-compatible headphones, assuming latency isn’t a major issue (you’re using LDAC, so it’s not), just use AAC; both fidelity and latency are better.

    EDIT: I should mention, it is known that vendors will tune codecs; I believe ValdikSS’s article on Habr briefly goes over this. So it’s very possible that tuning could mean some codec, including LDAC, is the only good one on a given set of hardware; however, with how badly LDAC maintains 990kbps, I doubt it makes much of a difference.

    • RunAwayFrog@sh.itjust.works · 1 year ago

      keep in mind that it’s hard to get real numbers on LDAC because decoding is proprietary

      I used to think the same. But as it turns out, a decoder exists. Maybe some people don’t want anyone to know about it to keep the myths alive ;)

      EDIT: Also, as a golden rule, whenever anyone sees the words High-Res in an audio context, they should immediately realize that they are being bullshitted.

    • marmo7ade@lemmy.world · 1 year ago

      Tell me you’re an apple fanboi without telling me you’re an apple fanboi.

      LDAC is “only” high res @ 990. OK? WTF is your point? It sounds better than every other codec.

      it still often loses when pipewire is compiled against libfdk

      Can you explain the practical implication of this when I listen to music on my Pixel phone? (spoiler: there is none)

      but typically AAC on supported headphones beat out 990kbps LDAC

      100% total bullshit. Here are the tests:

      https://www.soundguys.com/ldac-ultimate-bluetooth-guide-20026/

      AAC does not have better fidelity. What a joke of a claim.

      • drwankingstein@lemmy.dbzer0.com · 1 year ago

        At 990/909 kbps, Bluetooth can hardly hold that bitrate unless you have really good conditions; so much as walking down the street will bring it down to 660kbps.
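To illustrate the fallback behavior described above: LDAC’s adaptive mode steps down through its tiers as the link degrades. A hypothetical sketch; the retransmission-rate thresholds here are invented for illustration, not LDAC’s actual (proprietary) algorithm.

```python
# LDAC quality tiers, highest first (48 kHz family rates).
TIERS_KBPS = [990, 660, 330]

def pick_tier(retransmit_rate: float) -> int:
    """Choose the highest tier the link can sustain.

    Thresholds are made up for illustration; the real adaptive
    logic lives inside the proprietary encoder.
    """
    if retransmit_rate < 0.02:   # near-ideal RF conditions
        return TIERS_KBPS[0]
    if retransmit_rate < 0.10:   # e.g. walking down the street
        return TIERS_KBPS[1]
    return TIERS_KBPS[2]         # congested or marginal link

print(pick_tier(0.01))  # 990
print(pick_tier(0.05))  # 660
```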

        And yes, AAC does have better fidelity: at 320kbps, AAC and Opus are largely transparent to 90% of users. Keep in mind I am comparing fdk-aac on PipeWire, NOT Android; this is an important distinction, since they were testing Android, and you can see here how spotty AAC is on Android: https://www.soundguys.com/the-ultimate-guide-to-bluetooth-headphones-aac-20296/

        I am talking specifically about Linux in this context.

        EDIT: also, it’s not about being an Apple fanboy; Opus is largely just as good, marginally better even, but no headphones support it. If you want, you can even compile PipeWire with higher bitrate limits on Opus for stereo (IIRC the pro profile can override it? Can’t remember, but the code is here: https://gitlab.freedesktop.org/pipewire/pipewire/-/blob/master/spa/plugins/bluez5/a2dp-codec-opus.c)

        • neo (he/him)@lemmy.comfysnug.space · 1 year ago

          Opus is transparent for even the most intense songs by 160kbps, and for regular stuff you’d hear on the radio it’s transparent anywhere from 96kbps to 128kbps.

          • drwankingstein@lemmy.dbzer0.com · 1 year ago

            While this is the case for a lot of songs, a lot of instrument-heavy songs can cause noticeable artifacting for some people. It’s pretty rare, but in the end, it’s not like we are storing the media, so why care? We can do up to 320kbps for a stereo stream, and as far as I am aware there are no detriments to doing so (maybe marginally higher power usage, I guess).
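A quick back-of-the-envelope on why “why not?” holds for a live stream (the helper below is hypothetical, just the arithmetic):

```python
def kbps_to_mb_per_hour(kbps: int) -> float:
    """Data volume of a constant-bitrate stream: kbps -> megabytes per hour."""
    return kbps * 1000 / 8 * 3600 / 1e6

# Going from 128 to 320 kbps costs nothing that matters when the audio
# isn't being stored, which is the point being made above.
print(kbps_to_mb_per_hour(128))  # 57.6 MB/h
print(kbps_to_mb_per_hour(320))  # 144.0 MB/h
```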

            I wasn’t able to myself, but I did have a friend test the snug space Endless Lane track, and they were able to fairly reliably tell the difference between 128kbps and the original rip. The In the Moonlight track also has a high pitch… triangle, maybe? that can exhibit artifacts too.

            So, yeah, we have the 320kbps to work with, so why not use it?

      • denny@feddit.de (OP) · 1 year ago

        You all got a valid point… it’s just that mileage varies, and codec x will sound better in combination y. If I remember right, AAC on Android is at times implemented differently than on its home turf at Apple: the encoder works with smaller bitrates to save battery. There must be a special synergy for max-bitrate LDAC to sound worse than AAC, indeed. All in all, my post is about being open-minded and giving you the option to use a thing, rather than finding out which codec is universally the best: you virtually can’t, can you?