Hello everyone,
We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do because they will just post from another instance, since we changed our registration policy.
We keep working on a solution; we have a few things in the works, but that won't help us right now.
Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.
Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. And if it wasn't his community, it would have been another one. And it is clear this could happen on any instance.
But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.
Edit 2: removed that bit about the moderator tools. That came out a bit harsher than how we meant it. It’s been a long day and having to deal with this kind of stuff got some of us a bit salty to say the least. Remember we also had to deal with people posting scat not too long ago so this isn’t the first time we felt helpless. Anyway, I hope we can announce something more positive soon.
I hope the devs take this seriously as an existential threat to the fediverse. Lemmyshitpost was one of the largest communities on the network both in AUPH and subscribers. If taking the community down is the only option here, that’s extremely insufficient and bodes death for the platform at the hands of uncontrolled spam.
It doesn’t bode well.
Lemmy is new for all of us. I don't see any other solution right now. Do you have any ideas on how to handle it better?
I think better mod tools are needed but it will take time. Doesn’t mean the platform will die, but means we may have to deal with some stuff like this.
It's a hard problem but it absolutely is an existential risk. Spam is an existential risk. A platform that collapses under spam will either remain too small to be relevant or collapse from unusability. I'm sorry but I don't think your response completely grasps the number of forums, social media sites, wikis, etc. that have been completely crushed by spam.
I admit I haven’t kept track of that, true.
Solution now. Better solution later.
That's what I'm hoping. We can't just burn down any community that gets hit with spam.
I’ve just finished arguing with other lemmy users about how admins aren’t interested in taking on your legal risk. That was for the topic of piracy. CSAM is another issue entirely. Not only can lemmy users not expect to see a CSAM-friendly instance, lemmy users should expect to be deanonymized by law enforcement. Fuck around with kids and find out.
Downvote this message if you are a pedophile, as I’m taking the stance that CSAM should not be allowed on lemmy servers.
Are you seriously conflating my position with arguing that CSAM should be allowed?
Are people having a difficult time reading today? It’s not just you. Maybe it’s this topic and how it intermeshes with technology. Some people seem to think that there’s a technical solution for this already (one that works as well if not better than human moderators).
No, I don't think you personally are advocating for CSAM to be allowed. I think commenters are getting a little uppity about missing out on their favorite community while the admins deal with this content.
Imagine you owned an instance, and you found 100 moderators for your communities. You rest your head on the pillow and go to sleep. You wake up and find that some user has written a script to post CSAM to all your communities, because "fuck you, that's why". You get on the line with your moderators and they tell you they've been battling this all night, just banning people and deleting comments on sight. They tell you they've had to turn off a few communities and that some users are complaining. Your hard work over weeks and months to get this instance to a healthy place is being tested.

You get an email from your hosting service saying they have reports that your site contains CSAM and that's against their ToS; they give you a day to get it under control before they boot your server or turn it over to the police. Imagine in this case you make the drastic move to simply pull the plug, taking the entire instance offline until you can sort it all out. Now imagine some users come in and start complaining about how you, dear admin, are killing the fediverse. Personally, I have no sympathy for those users who complain about their community or instance being taken offline while admins deal with real shit.
What, pray tell, the fuck do you mean by this term specifically?
I’ve spent the better part of this morning explaining to people the fact that a community needs to be shut down in order for volunteers to work on cleaning it up in the time they have available.
Commenters seem to be pretty upset that something as "drastic" as turning off a community needs to be done. Some commenters have gone so far as to say that the policy of turning off communities in response to CSAM is what will "kill the fediverse".
I think the normal response to this is: "Wow, this sucks. Thanks, admins, for doing your best work. I understand the community may not come back for a bit, take all the time you need!". Yet what I hear is "it's the devs' fault for not putting in the code for blocking CSAM, and taking a community offline is unacceptable". I call that "uppity", but there's probably a better word for it.
So 4chan has this problem a lot, but they are also based in the US, where it's most definitely illegal, and they IP ban people, and I think for the most part it works. It did suck, though. I don't go on there anymore, but in the last few years that I did, if I was on mobile I would often get hit with a region ban, because so many people in that area were banned that they just decided to block the entire IP region to prevent anyone else from posting illegal content.
Maybe look into IP and region banning to prevent someone from just making new accounts.
You're discussing how to ban people; that isn't the problem.
The problem is this: In the last hour, 10,000 images were uploaded. Some of those contain CSAM. Now, you have 1 hour to find all the CSAM photos (0 to 10,000 of them). In the next hour, another 10,000 images will be uploaded, some of them containing CSAM…
Unless you have a lot of human moderators, you're going to use automated tools and get false positives or false negatives.
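For illustration, the simplest automated pass is just matching uploads against a list of hashes of already-known material. This is only a sketch; the file name and the hash list here are hypothetical:

$known_bad = file( 'known_bad_hashes.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES ); // hypothetical list of SHA-256 hashes of known material
$hash = hash_file( 'sha256', $_FILES['image']['tmp_name'] ); // hash the uploaded image
if ( in_array( $hash, $known_bad, true ) ) { die('nope!'); } // only catches bit-perfect copies already on the list

Anything new, re-encoded, cropped, or resized sails right past that check (false negatives), and the fuzzier perceptual or ML approaches that catch more also start flagging innocent images (false positives) that a human still has to review.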
A site like 4chan banning whole regions isn't a great example of handling this well. I don't think I need to explain (but maybe I do) that one person in a region posting CSAM doesn't mean the entire region posts CSAM. You could just as well opt to block all regions by pulling the site off the internet. Not to mention, does this now mean that 4chan allows CSAM for certain regions? Yikes. "Children can be abused only in these countries." "I'm sorry, but your country's laws prevent images of children being abused, so this content is banned." Yikes.
Again, the technical issue isn't banning. Here's the code to ban a user at IP 1.2.3.4:
if ( $_SERVER['REMOTE_ADDR'] === '1.2.3.4' ) { die('nope!'); }
Here’s the code to ban a user at a specific region (pseudocode):
$geoip = new GeoIPDB(); $region = $geoip->get_region( '1.2.3.4' ); if ( $region === 'USA' ) { die('nope!'); }
This isn’t difficult.
Now, for the code to DETECT CSAM:
Look for skin-tone tints (taking into account all skin tone colors), look for the quantity of skin in the image (which would flag close-ups of arms as possible nudes), detect a person in the photo, determine the person's age from the photo, make sure not to flag art or artful nudes, etc… or, you know, this is a lot of work; let's make the humans detect it instead.
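Just to show how crude that kind of heuristic actually is, here's a sketch of the "quantity of skin" check using GD. The file name and thresholds are made up, and it will happily flag sunsets and close-ups of arms:

$img = imagecreatefromjpeg( 'upload.jpg' );          // hypothetical upload
$w = imagesx( $img ); $h = imagesy( $img );
$skin = 0; $total = 0;
for ( $x = 0; $x < $w; $x += 10 ) {                  // sample every 10th pixel
    for ( $y = 0; $y < $h; $y += 10 ) {
        $rgb = imagecolorat( $img, $x, $y );
        $r = ( $rgb >> 16 ) & 0xFF; $g = ( $rgb >> 8 ) & 0xFF; $b = $rgb & 0xFF;
        if ( $r > 95 && $g > 40 && $b > 20 && $r > $g && $r > $b ) { $skin++; }  // "skin-ish" pixel, allegedly
        $total++;
    }
}
if ( $total > 0 && $skin / $total > 0.4 ) { echo "flag for review\n"; }  // also flags sand, wood floors, arms...

That's the level of tooling you get without humans in the loop.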
Region banning would prevent anyone in the area from posting. I even mentioned that I used to come across bans meant for other people. In the case of 4chan, when they region ban, it's possible someone else will be prevented from posting.
Now, if you want to talk about legality in other countries - that’s a different discussion. The internet is open to the WORLD. And all I would be comfortable confirming is that it’s definitely illegal in the US where I am. I’m not gonna get into other countries where it might not be illegal. I don’t know enough about those places to be able to tell you more.
Basically, a region ban would be similar to just pulling that instance down. Blocking whatever region that person was posting from would prevent them from posting, as well as from making local accounts to try and post more.
When I would be downtown where I live and got hit with a ban that wasn't meant for me, just because I was in the region that was banned, I was able to appeal it. In order to appeal, you have to be good at using your words, because a person has to sit there and read the appeal to decide whether to unban you or not. Mine always went through, but I'm also capable of talking things out and smart enough to know when to properly explain myself.
Other people didn't get their appeals granted, and I would see them complain about it elsewhere.
Anyway, you don’t need to condescend to me. I’m not against what you’re saying. I agree with a lot of what you said in other comments.
I mentioned this before but I’m sorry that I didn’t see who I was responding to. I usually respond on the internet to ideas, not people. Today I’ve been responding a lot to the idea that CSAM is easy to fix and that for reasons unknown it just hasn’t been done with lemmy and the way it’s being done with lemmy isn’t “the right way”.
GeoIP databases aren’t perfect, which is another problem entirely. It’s better than pulling the plug on the entire internet, sure, but it has its own problems.
I was responding to the idea of gating CSAM content via GeoIP with "yikes" because I can't see myself allowing CSAM in some countries just because it's "legal" in those countries. This is a moral argument I'm making, but I am happy imposing US law as it relates to CSAM being illegal (not US law such as FOSTA/KOSA, etc.; those are a different can of worms entirely) on other countries. Or to put it another way: as an admin, if I get an email saying "actually bro, in country xyz we get to abuse children", it won't sway me into allowing that content in that country. If someone in that country wants to put up a site for that country, that's their problem (and if I could intervene and prevent them from doing so, I would).
Right, it's definitely not an easy fix, and Lemmy doesn't even operate the way other sites do, but today I'm learning that these instances seem to be easily exploitable.
The reason I mentioned region banning is because it definitely worked. There weren't people uploading 10,000 images of CSAM, because if you tried to, you'd get banned so hard that you'd ruin it for other people posting nearby.
I agree. Honestly, if I were in charge in any way, those countries just wouldn't be allowed access. And that does happen. I used to work for an app where we had people working in the Philippines who couldn't access the app itself. We had to just give them info and they would feed it to the customers. And it was because their country is blocked from viewing the app in the first place. They're just straight up not allowed to use it there.
Like I’m totally with you. Fuck MAPs, fuck all of em. If some archaic country still participates in something that is obviously harmful to people - yeah, impose these laws on them. Tell them to fuck off until they stop this shit.
And let's be real: it's gonna be years before they ever stop.
How does an IP Ban work when this attack came through a different, legitimate, federated Lemmy server?
I don’t think the comment above was trying to express dissatisfaction towards Lemmy’s hosts for failure to respond. They’re simply stating that the way things are all set up, much as we might like it, has serious problems - ones that may end up being considered unsolvable. As you said, we might be heading for an eventual plug pull.
It’s like pointing out that cars produce fossil fuel exhaust. It sucks, and we’re seeing it as unsustainable, but there’s no convenient alternative yet.
Things are set up the way they are because it's the best way that admins (not just of lemmy instances but of major sites like reddit and facebook) have found to handle these situations.
You could take it a step further and give law enforcement their own backdoor to your site, as Facebook has done, but I would not advocate for that solution. We are in a special place in the internet where we can somewhat self-police our own content, assuming we actually self-police our own content. The way we do this is the way these admins are currently handling this.
It may be reasonable to think that sites like reddit and facebook have it all figured out, but all they have is code similar to what lemmy has, plus a bit more money to pay some content moderators on trust and safety teams to actually remove this content before users get a chance to see it. The difference between those sites and lemmy is $$$, and that's not something that's likely to change anytime soon.
Sorry to hear about your investment in lemmy. How much did you end up investing? It just sounds like you're very unsatisfied with the value that lemmy has provided.
Personally, I don't pay for lemmy. Lemmy is free, as far as I understand it. It being free, I can't really dictate the legal risk that the admins have to take on, as I do not have power over them, and because I treat them as humans.
But yeah, I guess if you have a good reason, they really should be falling over backwards to moderate all the CSAM away from your favorite community. You are an all-powerful being.
Edit: Sarcasm on the internet doesn’t work well so let me be frank: admins aren’t responsible for going to jail for a user’s desire to post CSAM. admins have a right to shut down a community that posts CSAM or remove CSAM or any material they find objectionable from their site. Admins take on the legal risk of the content on the site and OWE USERS NOTHING. Y’all can “the customer is always right” all you want but if you aren’t the one paying you aren’t the customer and you aren’t right.
I feel like you didn’t actually read their comment before posting, !dipshit@lemmy.world
It has nothing to do with Lemmyshitpost being their “favorite community” and they never mentioned “investing” or “value”. That’s all from you. Stop strawmanning their position. They were criticizing the ease with which entire communities can be taken down by single individuals. Additionally, it seems you are contradicting your own post from 20 minutes prior to your current comment. Perhaps you responded to the wrong comment?
I read it. Let’s read it together again.
The developers who build lemmy aren't able to put in CSAM-blocking code. That's not how this works. I assume the commenter meant to say "admins" here, as developers write the code; they don't admin the sites. If a developer has a lemmy instance they admin, then they are both a dev and an admin. Lemmy wasn't built for CSAM sharing specifically; it is a site that allows sharing of CSAM about as much as reddit or facebook do. The devs can't do much about this. The admins and mods can.
Neat. Irrelevant, but cool.
This I take issue with, and is what I mostly responded to.
"If taking the community down is the only option here": well no, it's not. We could just get hundreds of mods to specifically address this one user's posting of CSAM. Hey, anyone want to moderate the site? Oh right, they'll need to be vetted, and they'll need to keep doing this on the side, for free, as volunteers, since lemmy is volunteer run…
"that's extremely insufficient": hard disagree. A community is liable for the content on it. If a CSAM post goes up on a site and is left around for a few minutes, that's one thing. If it's left up for days and weeks, that's quite another problem entirely. The minute that an admin or mod saw CSAM material, they did the right thing by shutting it down, even if it means downtime for users. Oh no! Users can't read lemmyshitpost and now the world is ending.
"and bodes death for the platform at the hands of uncontrolled spam." Welcome to the internet, where all platforms are at risk of uncontrolled spam. At first it was just email, then bulletin boards, then message boards, then forums, and then community-moderated forums like reddit and lemmy. This has been and will be a problem. This isn't a new concern for lemmy devs or admins or mods; they are all aware that this can happen, and it is why they do what they do. Turning off the community is a viable option, and it is what has happened at larger companies too while they cleaned up the mess.
I’ve been very consistent in my arguments. Show me the contradiction and I’ll address it.
TL;DR: users cannot expect to be allowed to post CSAM material on lemmy instances. Allowing CSAM material to stay up on lemmy instances constitutes a legal risk for admin owners, and thus we cannot leave it up. Blocking a community (even if it's like the bestest and most favorited and most subscribed and everyone loves it and wow just super-duper community) is a viable means of blocking all CSAM in that community while it is cleaned up. To suggest that the community should have stayed online is asinine. To suggest that the admins should not have blocked a community to combat CSAM is asinine. Trust admins to do their jobs.
They aren't asking devs to be admins or for admins to be devs. They specifically called out the developers because code to filter child sexual abuse material exists, disseminated by organizations such as the FBI and other law enforcement agencies, and it can be hooked into image uploading.
NOBODY in this comment section is advocating for uploading fucking child sexual abuse material. That is a strawman you are setting up. Nobody is advocating for allowing the uploading of child sexual abuse material, or for the “material to be up on lemmy instances”. NOBODY is suggesting that a single instance going down is “the world is ending”. NOBODY is asking for “100’s of mods to specifically address this one user’s posting of CSAM”.
You’re setting up a strawman argument nobody is proposing. The criticism is that, at this moment, the developers of Lemmy have not implemented a method for automatically vetting uploaded images for CSAM without requiring “100’s of mods”, which is what resulted in the condition that “taking the community down is the only option here”.
Perhaps the wording of the original post was not precise and accurate enough for your full and complete understanding of the intent and meaning behind it. In this post, I have attempted to elucidate that intent and meaning to a degree which I hope is understandable to you.
Yeah? I doubt this is true, but I could be wrong. You make it sound like preventing CSAM is as simple as importing a library, something I find dubious. Companies have been trying to filter out this material in an automated fashion for decades, and yet they still have to employ humans to do it manually because automated means don't really work. This is why companies like Reddit and Facebook have trust and safety teams to do this work.
Edit: I googled and could not find this database. I'm thinking it's a myth.
Ahem, there were users who uploaded CSAM. Those are the users who were advocating for uploading CSAM, because they uploaded CSAM.
I'm literally arguing with people who are saying that they shouldn't have shut down the community because it's big, and that shutting down the community (not the CSAM) poses a threat to the fediverse. Maybe, but CSAM poses a legal threat, which is much greater than the threat of low engagement.
Yeah, that doesn't exist, as I've mentioned previously. You make it sound like getting CSAM off lemmy is as simple as writing some code. If it were, why don't facebook and reddit do this?
You’re not understanding how CSAM detection works or is handled.
The grim reality is this: cameras exist, children exist, adults exist, the internet exists, and the second that a crime is committed, it is not added to an FBI database. If such an FBI database existed, and IF it was useful (and not just a database of hashes for bit-perfect copies of CSAM), and IF it were updated when evidence of the crime surfaces… IF all of those things are true, THEN it means there's still likely a huge swath of CSAM material out there that could be posted at any time and would NOT be detected.
Again, ask yourself: IF such a database existed, then WHY don't reddit, twitter, facebook, hell, why doesn't every or any site use it?
Pedophiles, instead of downvoting me, why not explain yourself?
As another commenter posted below:
As far as I am aware, every major site does use it in addition to manual vetting for any flagged “borderline” or “uncertain” results caught up in the filter.
Db0 even created a tool for Lemmy:
https://lemmy.dbzer0.com/post/2896209
I think this is where you could be wrong. I appreciate the links; I'll look into those in more detail. My best understanding is that these tools generate so many false positives and false negatives that it's not worth relying on them alone. They may be a first line of defense until real humans get to see the content, but my point is that humans are still needed. When humans have to be in the loop because the system isn't 100%, it means humans do the labor, and with limited time, they have to decide when they can do that labor. Sometimes shutting down a community is the best way to stop the flood while they clean up the mess.
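To put that in concrete terms, the workflow I'm describing looks something like this. The function names, score scale, and thresholds are all hypothetical, not how db0's tool is actually wired up:

$path  = $_FILES['image']['tmp_name'];
$score = scan_image( $path );                                  // hypothetical scanner returning a 0.0 - 1.0 "likely CSAM" score
if ( $score > 0.95 )     { reject_and_report( $path ); }       // near-certain match: block it and report it
elseif ( $score > 0.50 ) { hold_for_human_review( $path ); }   // uncertain: a volunteer still has to look at it
else                     { accept_upload( $path ); }           // low score, but false negatives still land here

The middle bucket is where all the human labor lives; when it fills faster than volunteers can empty it, turning the community off is the only lever left.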
This is just a matter of confirmation bias on your side now. You stubbornly refuse to accept factual information very helpfully delivered to you by users who have many better things to do than respond to your inquiries, and you dogmatically refuse to acknowledge that there are advanced and reliable automation tools available for the use case in question. And while you do all that, you belittle the other users in the community by referring to your supposedly superior knowledge and experience, while somehow failing to provide any data or secondary sources to back up your claims.
They absolutely can, and every forum under the sun has tools and extensions to help with this. Fucking 4chan has code specifically dedicated to dealing with CSAM. You have no clue what you're talking about.
Replace this with !technology@lemmy.world, or !selfhosted@lemmy.world, or !announcements@lemmy.world. “Oh no, users can’t read the entire site” yes that is the definition of the end of the site.
You’re not seeing that this isn’t a lemmyshitpost issue, it’s an “any popular community on lemmy” issue. Snarkily taking potshots at lemmyshitpost as a community doesn’t change it.
It's not that it's "not an option"; it's the last resort. It's like saying your only option upon seeing a roach in your apartment is to burn the whole building down. Because doing it means you don't have a community anymore, and without communities the site has no purpose.
Cool, name them and give me links then. I could not find any such tools on the internet. There is software that tries to detect this, but even youtube's algorithm incorrectly flags fully clothed 30-year-old women as children.
Links or you’re talking out your ass.
Fill a site with CSAM and you'll find it's not a site you'll want to go to in the first place. READ the original post, where it was mentioned that this is a stop-gap measure.
You’re… offended that I had snark on the topic of lemmySHITPOST? surely, you are joking.
My point is not that this community is shit and that’s why this happened.
My point is that this is a community on a lemmy instance that was flooded with CSAM, and it was shut down because of that flood of CSAM.
You do see how turning a community off and then on again isn't the same thing as burning down a house (and un-burning it again), right?
You do realize that we’re talking about a literal crime against children vs your ability to see memes? Fuck off with your self-importance.
Are you being intentionally dense or do you not understand that it’s my point? If someone can flood lemmy with CSAM so easily that the only way to stop it is a site shutdown, then there are not sufficient mitigation measures in place.
Yes, this is the Internet. Take your statement and replace “lemmy” with “reddit” “facebook” “9gag” “imgur” etc… No site has “sufficient mitigation measures in place” as CSAM continues to flood the internet.
"Flooding" a site with CSAM is a matter of opinion. If one person posted one image of CSAM on my instance, that would be flooding; that's one image too many. It's not like there's some magic threshold for the amount of CSAM allowed on a site. All sites use human moderators to detect CSAM, and all sites that do this have teams that are far too small and far too underpaid for the most part.
Underpaid being the keyword here, as lemmy admins are volunteers. I would think that the threshold for “flooding” a lemmy instance with CSAM would be far lower than that of a major for-profit site.
It's true; if I remember correctly, tumblr was removed from the App Store because of CSA issues. I could be remembering wrong, and maybe it was the Google Play Store.
Have you ever talked to janitors and mods on 4chan? Good luck getting any info out of them.
Do you realize that 4chan isn’t the full internet? That these programs that you already know of can exist outside of 4chan? I’m asking you - the person who knows of these apps - to provide links to back up your claims.
I'm not the other person you responded to, and I never claimed to have any apps or links.
I'm just telling you how this works on 4chan. I'm aware that's not the entire internet, obviously; your sarcasm needs work, considering we are both here on Lemmy, i.e., not 4chan.
If anyone on there is using these programs/apps/whatever, they’re not just gonna tell other people about them.
And as far as I know (I haven't been on 4chan in like 3 years now), they region ban for CSAM.
Everyone got your sarcasm. We just think the Lemmyverse has no chance when it's flooded with child porn.
I’m a victim of CSAM and my dad exploited me for several websites.
I get being upset about this. But it’s not the end of the world for a site. Lemmy is still totally fine and I have been using it without seeing any CSAM and the only knowledge I even have of this is from posts like OP’s.
Like this isn’t a good time to be just down on the site and pessimistic.
Removed by mod
People are downvoting you because you’re acting like a dick.
deleted by creator
People have been, but you’re not truly listening, Internet Warrior.
Whatever you say, kid.
I’m over 50, but you keep doing you, Internet Warrior, as it just proves my point.
I agree with a lot of what you said and upvoted you but you really need to just stop calling people pedos for disagreeing with you.
I’m a victim of CSAM myself and you can take a look through my comment history where I talked about it in depth more. I hate pedos just as much as you do but going around calling people pedos isn’t going to do anything but upset people.
I'm taking the radical stance that CSAM isn't a good thing, that it should be reported to law enforcement, and that shutting down a site or community with CSAM on it is a viable option for handling CSAM material.
I’m getting downvotes from people who disagree with me on this “radical” stance. People who disagree that CSAM is a problem, that CSAM is a concern. I don’t have a lot of sympathy for people who promote CSAM like the people who downvoted my posts. I don’t care about the loss of internet points, I care that these worthless shits are still on lemmy, so yes, I call them what they are.
I mean, I think people are downvoting you for other reasons.
Obviously I agree with you that CSAM is bad. It happened to me and ruined my fucking life for like all of my teen years and then most of my early 20s.
But calling people names is pointless. Especially when it comes off like a baseless accusation.
Noted. You’ll have to excuse the fact that I don’t really care about calling people names on the internet if the content of their message promotes abuse.
Yeah I get that for sure. I mean, if I knew someone was some kind of MAP idiot who was trying to fight for the rights of pedos, I’d call them names too. Idiot seems fitting for that lol
You're completely misinterpreting everything we said. If we shut down every site with CSAM on it, the internet wouldn't exist. We don't disagree that CSAM is a problem. We disagree with your solution.
Not at all. I am understanding you completely.
You are wrong. My site doesn’t have CSAM. Lots of other sites don’t have CSAM. The internet isn’t just for CSAM. You must be smarter than this.
My solution which is to remove CSAM? My solution to turn off communities while the CSAM issue is cleaned up? What about those solutions do you disagree with?
Another question for you: if your house is flooding due to a burst pipe, what do you do first:
a) get all the water out of the house
b) turn off the water coming into the house.
My solution would be to do step B followed by step A. Your solution appears to be to just do step A, which means you'll constantly be flooded and never have enough manpower to dry out your house.
I’d bet money that the following will happen:
In the meantime, folks missing the community are free to go elsewhere on the internet. Why? Because CSAM is a crime that depicts sexual assault, and the evidence is posted online. It's not a matter of just deleting content; it's also a matter of turning the people posting that content over to the police so they can be held accountable for their crimes.
Sorry let me word this correctly: social media wouldn’t exist.
No, your solution is to permanently shut down Lemmy, since there is always the possibility of CSAM being on some instance. The community it's posted in doesn't matter. They can just keep spamming CSAM, and the mods can't do anything about it except shut down the instance/community, unless there are better tools to moderate. That's basically what everyone wants: better tools and more automation so the job gets easier. It's better to have a picture that is wrongly flagged as CSAM removed than to not remove one that actually is CSAM.
The problem is that it won’t stop and that it will happen again.
You're wrong at step 2. The posters might've used Tor, which basically makes it impossible to identify them. Also, in most cases LE doesn't do shit. So the spamming won't stop (unless someone other than LE does something about it). We can't rely only on LE to do their job. We need better moderation tools.
Also even if the community is turned back on, what’s stopping someone from doing it again? This time maybe a whole instance?
It’s simply too easy to spam child porn everywhere. One instance of CP is much easier to moderate than thousands.
FYI, admins have the protection of federal law and won't be held responsible, as long as they take action when it happens.
They have very low to zero legal risk, as long as they’re doing their job.
IANAL, but I can read laws.
Correct, emphasis mine. As long as they take action when it happens being the key phrase here.
IANAL, but from what I understand, doing something to take action (removing content, disabling communities, banning users, all of the above) shows that they are working to remove the content. This is why, previously, when having conversations with people about the topic of piracy, I mentioned DMCA takedown notices and how the companies I've worked at responded to those with extreme importance (sometimes the higher-ups would walk over to the devs and make sure the content was deleted).
I'm annoyed at people in this thread who believe that the admins did the wrong thing because turning off communities could cause users to go to another instance; who cares, this is bigger than site engagement. I'm annoyed at people who think that the devs had access to code which could prevent this issue but chose not to implement it; this is a larger and much more difficult problem that can't just be coded away, and it usually requires humans to verify that the code is working and to correct false positives and false negatives.
You misunderstood what I meant by the part that you highlighted of my comment.
I'm speaking of Safe Harbor provisions, not having to take active DMCA actions. They're two very different things.
I have a recurring donation to the instance, but that's beside the point.
You’re right, it is. You may be the sole person donating, and maybe you of all people have the “right” to have your opinion “respected” for donating. My point is that by and large, the CSAM posters and most people who use this site aren’t directly paying for a service which contractually obligates them to take part in the site or service, let alone by posting CSAM.