• Neshura@bookwormstory.social · 10 months ago

    Also, CSAM detection algorithms are known to misfire on occasion (it’s hard, if not impossible, to tell apart a picture of a naked child sent for porn purposes from one that wasn’t), and people want to avoid any false allegations of that if at all possible.
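
    To illustrate why misfires are inherent to this kind of scanning: detectors in the PhotoDNA family typically compare a perceptual hash of an upload against a database of known material and flag anything within a small bit-distance, so an unrelated image whose hash happens to land inside that threshold trips the same alarm. A rough sketch below (the hash values, database entry, and threshold are all made up for illustration, not any real system’s parameters):

        # Hypothetical 64-bit perceptual hashes; a real scanner would
        # derive these from image content, not use literals.

        def hamming_distance(a: int, b: int) -> int:
            """Number of differing bits between two 64-bit hashes."""
            return bin(a ^ b).count("1")

        KNOWN_HASHES = {0xDEADBEEF12345678}  # placeholder database entry
        MATCH_THRESHOLD = 10                 # bits; a looser threshold means more misfires

        def is_flagged(upload_hash: int) -> bool:
            # Flag the upload if it is "close enough" to any known entry.
            return any(hamming_distance(upload_hash, h) <= MATCH_THRESHOLD
                       for h in KNOWN_HASHES)

        # An innocent image whose hash coincidentally differs by only a
        # few bits from a database entry gets flagged just the same:
        innocent = 0xDEADBEEF12345679  # 1 bit away from the placeholder entry
        print(is_flagged(innocent))    # True -> a false positive

    The threshold exists because re-encoding, cropping, or resizing shifts a few bits of the hash, so exact matching would miss real hits; the cost of that tolerance is occasionally flagging images that were never in the database at all.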