• m-p{3}
    50 points · 10 months ago

    Looks like some CSAM fuzzy hashing would go a long way toward catching someone trying to submit that kind of content, if each uploaded image is scanned.

    https://blog.cloudflare.com/the-csam-scanning-tool/

    Not saying to go with Cloudflare (just showing how the detection works overall), but some kind of built-in detection system coded into Lemmy that grabs an updated hash table periodically would help. A rough sketch of the idea is below.
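    Here's roughly what that check could look like in an upload handler, in Rust since that's what Lemmy is written in. Everything in it is made up for illustration: the fuzzy_hash stub stands in for a real perceptual hash (pHash/dHash style), the blocklist would actually come from a vendor feed refreshed by a background job, and the distance threshold is arbitrary. A real integration would also report matches, not just reject them.

    ```rust
    use std::collections::HashSet;

    // Hypothetical 64-bit perceptual hash of an uploaded image.
    // A real implementation would use an actual pHash/dHash library;
    // this placeholder only shows where such a hash would plug in.
    fn fuzzy_hash(image_bytes: &[u8]) -> u64 {
        image_bytes
            .iter()
            .fold(0u64, |acc, &b| acc.rotate_left(5) ^ b as u64)
    }

    // Hamming distance: number of bits that differ between two hashes.
    fn hamming_distance(a: u64, b: u64) -> u32 {
        (a ^ b).count_ones()
    }

    // Blocklist of known-bad hashes, refreshed periodically from a
    // (hypothetical) vendor endpoint by a background job.
    struct HashBlocklist {
        hashes: HashSet<u64>,
        // Max bit difference still treated as a match -- the "fuzzy" part.
        max_distance: u32,
    }

    impl HashBlocklist {
        // Returns the matching known hash if the upload is close to one.
        fn matches(&self, image_bytes: &[u8]) -> Option<u64> {
            let h = fuzzy_hash(image_bytes);
            if self.hashes.contains(&h) {
                return Some(h); // exact hit
            }
            self.hashes
                .iter()
                .find(|&&known| hamming_distance(h, known) <= self.max_distance)
                .copied()
        }
    }

    fn main() {
        let blocklist = HashBlocklist {
            hashes: HashSet::from([0xDEAD_BEEF_0000_0000_u64]),
            max_distance: 5,
        };

        let upload = b"fake image bytes";
        match blocklist.matches(upload) {
            // A real deployment would reject the upload *and* report the match.
            Some(hit) => println!("matches known hash {hit:#018x}: reject and report"),
            None => println!("upload passes the hash check"),
        }
    }
    ```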

    • wagesj45
      25 points · 10 months ago

      Not a bad idea. I was once working on a project that would support user-uploaded images and looked into PhotoDNA, but it was an incredible pain in the ass to get access to. I’m surprised no one has realized that this should just be free and available. It’s kind of gross that it’s put behind an application/paywall, imo. They’re just hashes and a library to generate the hashes. Why shouldn’t that just be open source and available through the NCMEC?

      • @shagie@programming.dev
        27 points · 10 months ago

        Putting it behind a 3rd-party API that requires registration ensures that the 3rd party, which is under contract to report this material, actually does so. It isn’t enough just to block it; it needs to be reported too. Google and Cloudflare report it to the proper authorities.

        Additionally, if it were open source, people trying to evade it could just download the tool and tweak their images until they no longer get flagged.

        • wagesj45
          16 points · 10 months ago

          They could tweak their images regardless. Security through obscurity is never a good solution.

          I can understand the reporting requirement.