• m-p{3}@lemmy.ca · 1 year ago

    Looks like some CSAM fuzzy hashing would go a long way toward catching someone trying to submit that kind of content, if each uploaded image is scanned.

    https://blog.cloudflare.com/the-csam-scanning-tool/

    Not saying to go with Cloudflare (just showing how the detection works overall), but some kind of built-in detection system coded into Lemmy that periodically grabs an updated hash table would do the job.
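    To make this concrete, here's a minimal sketch of upload-time fuzzy-hash matching, using the open-source imagehash library's phash as a stand-in for PhotoDNA-style hashes. The hash table, threshold, and file name are made up for illustration; this isn't any real Lemmy or Cloudflare API:

    ```python
    # Minimal sketch of fuzzy-hash scanning at upload time.
    # Requires: pip install pillow imagehash
    from PIL import Image
    import imagehash

    # Stand-in for the periodically refreshed table of known-bad hashes
    # (a real deployment would pull these from a source like PhotoDNA/NCMEC).
    KNOWN_BAD_HASHES = {
        imagehash.hex_to_hash("d1d1d1d1d1d1d1d1"),  # placeholder entry
    }

    # Perceptual hashes of near-duplicates differ in only a few bits, so
    # matching means "Hamming distance under a threshold", not exact equality.
    MAX_DISTANCE = 8

    def is_flagged(path: str) -> bool:
        upload_hash = imagehash.phash(Image.open(path))
        return any(upload_hash - bad <= MAX_DISTANCE for bad in KNOWN_BAD_HASHES)

    if is_flagged("upload.jpg"):
        print("reject the upload (and report it, see below)")
    ```

    The distance comparison is the whole point: it's what lets re-encoded, resized, or slightly edited copies still match a known hash.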

    • wagesj45@kbin.social · 1 year ago

      Not a bad idea. I was once working on a project that would support user-uploaded images and looked into PhotoDNA, but it was an incredible pain in the ass to get access to. I'm surprised no one has realized that this should just be free and available. Kind of gross that it's put behind an application/paywall, imo. They're just hashes and a library to generate the hashes. Why shouldn't that be open source and available through the NCMEC?

      • shagie@programming.dev · 1 year ago

        Putting it behind a third-party API that requires registration ensures that the third party, which is under contract to report matches, actually does so. It isn't enough just to block the content; it needs to be reported too. Google and Cloudflare report it to the proper authorities.

        Additionally, if it were open source, people trying to evade it could just download the tool and keep tweaking their images until they no longer get flagged.

        • wagesj45@kbin.social · 1 year ago

          They could tweak their images regardless (there's a quick sketch of that below). Security through obscurity is never a good solution.

          I can understand the reporting requirement.
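
          For what it's worth, a perceptual hash is built to survive small edits, so trivial tweaks barely move it; actually evading one takes iterated, more aggressive changes, which someone can attempt against a closed tool too just by probing it with uploads. A quick sketch of that property, again with the open-source imagehash library (the file name is hypothetical):

          ```python
          # How far does a trivial tweak move a perceptual hash?
          # Requires: pip install pillow imagehash
          from PIL import Image, ImageEnhance
          import imagehash

          original = Image.open("photo.jpg")  # hypothetical input image
          tweaked = ImageEnhance.Brightness(original).enhance(1.1)  # +10% brightness

          # Hamming distance in bits (0 to 64 for the default 64-bit phash).
          # Small edits typically land well under a match threshold like 8.
          print(imagehash.phash(original) - imagehash.phash(tweaked))
          ```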