Lemmyshitpost community closed until further notice - Lemmy.world
Hello everyone, we unfortunately have to close the !lemmyshitpost community for
the time being. We have been fighting the CSAM (Child Sexual Abuse Material)
posts all day but there is nothing we can do because they will just post from
another instance since we changed our registration policy. We keep working on a
solution, we have a few things in the works but that won’t help us now. Thank
you for your understanding and apologies to our users, moderators and admins of
other instances who had to deal with this. Edit: @Striker@lemmy.world
[https://lemmy.world/u/Striker] the moderator of the affected community made a
post apologizing for what happened. But this could not be stopped even with 10
moderators. And if it wasn’t his community it would have been another one. And
it is clear this could happen on any instance. The only thing that could have
prevented this is better moderation tools. And while a lot of the instance
admins have been asking for this, it doesn’t seem to be on the developers’
roadmap for the time being. There are just two full-time developers on this
project and they seem to have other priorities. No offense to them but it
doesn’t inspire much faith for the future of Lemmy. But we will not give up. We
are lucky to have a very dedicated team and we can hopefully make an
announcement about what’s next very soon.
They also shut down registration
Whoever is spamming CP deserves the woodchipper
Looks like some CSAM fuzzy hashing would go a long way to catch someone trying to submit that kind of content if each uploaded image is scanned.
https://blog.cloudflare.com/the-csam-scanning-tool/
Not saying to go with Cloudflare (just showing how the detection works overall), but some kind of built-in detection system coded into Lemmy that periodically pulls an updated hash table would go a long way.
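To illustrate the fuzzy-hashing idea above: a minimal sketch of perceptual ("fuzzy") hash matching, assuming a simple average-hash scheme. Real systems like PhotoDNA or Cloudflare's scanner use far more robust algorithms and vetted hash lists from reporting organizations; the function names and pixel data here are purely hypothetical.

```python
# Illustrative perceptual hashing: an "average hash" (aHash) plus a
# Hamming-distance comparison against a blocklist of known hashes.
# This is a teaching sketch, NOT a real CSAM-detection algorithm.

def average_hash(pixels):
    """Compute a 64-bit perceptual hash from an 8x8 grayscale grid.

    pixels: list of 64 ints (0-255), e.g. an image downscaled to 8x8.
    Each bit is 1 if that pixel is at or above the mean brightness.
    """
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_flagged(image_hash, blocklist, max_distance=5):
    """Fuzzy match: flag if the hash is within a few bits of any known hash."""
    return any(hamming_distance(image_hash, h) <= max_distance for h in blocklist)

# Hypothetical example: a slightly altered image still hashes close to
# the original, so small tweaks don't evade the match.
original = average_hash([10] * 32 + [200] * 32)
tweaked = average_hash([10] * 31 + [60] + [200] * 32)  # one pixel changed
assert is_flagged(tweaked, [original])
```

The point of the Hamming-distance threshold is exactly what makes this "fuzzy": an exact cryptographic hash (SHA-256 etc.) changes completely on a one-pixel edit, while a perceptual hash changes by at most a few bits, so near-duplicates still match.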
Not a bad idea, but I was once working on a project that would support user-uploaded images and looked into PhotoDNA, and it was an incredible pain in the ass to get access to. I’m surprised that someone hasn’t realized this should just be free and available. Kind of gross that it is put behind an application/paywall, imo. They’re just hashes and a library to generate the hashes. Why shouldn’t that just be open source and available through the NCMEC?
Putting it behind a 3rd party API that has registration ensures that the 3rd party that is under contract to report it does so. It isn’t enough just to block it - it needs to be reported too. Google and Cloudflare report it to the proper authorities.
Additionally, if it was open source, people trying to evade it could just download the open source tool and tweak their images until they come back without getting flagged.
They could tweak their images regardless. Security through obscurity is never a good solution.
I can understand the reporting requirement.
Works only if your server is hosted in the US