• sp3tr4l@lemmy.zip · edited · 6 months ago

Well, better regulation of algorithms is not something that is going to happen.

Assuming you could actually specify this kind of content… which you probably can to some degree, from the standpoint of the engineers behind these things… there's basically no way to ban or limit it in a law.

1: Giant freedom-of-speech-based opposition. If you penalize this kind of content, the claim will be that you're limiting free speech and artistic expression, and to some extent that's true.

2: Without literal access to how the algorithm works, it'd take a massive tome of a law to try to pass. And software changes, so… a platform can probably rewrite its way around any specific rule meant to limit this kind of content.

I don’t know. Maybe you could pass a law mandating that if your platform has x many users or daily views, you must give users far, far more in-depth means to manage the content they’re served (something like the sketch below).
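
To make that a bit more concrete, here’s a minimal, purely hypothetical sketch of what “more in-depth means to manage your own content” could look like on the platform side: user-defined block/mute rules applied on top of whatever ranking the recommendation algorithm spits out. All the names, fields, and the scoring scheme are made up for illustration, not how any real platform does it.

```python
from dataclasses import dataclass, field

@dataclass
class FeedPreferences:
    # Topics the user never wants recommended, regardless of engagement score.
    blocked_topics: set[str] = field(default_factory=set)
    # Topics the user wants heavily down-weighted rather than removed outright.
    muted_topics: set[str] = field(default_factory=set)
    # How strongly muted topics are suppressed (0 = effectively hidden, 1 = no effect).
    mute_weight: float = 0.2

@dataclass
class Candidate:
    post_id: str
    topic: str
    engagement_score: float  # whatever the platform's ranking model produced

def apply_user_controls(candidates: list[Candidate],
                        prefs: FeedPreferences) -> list[Candidate]:
    """Re-rank the feed with the user's own rules layered on top of the platform ranking."""
    kept = []
    for c in candidates:
        if c.topic in prefs.blocked_topics:
            continue  # user said never show this; drop it entirely
        score = c.engagement_score
        if c.topic in prefs.muted_topics:
            score *= prefs.mute_weight  # keep it, but push it far down the feed
        kept.append(Candidate(c.post_id, c.topic, score))
    return sorted(kept, key=lambda c: c.engagement_score, reverse=True)

# Example: a user blocks "rage_bait" and mutes "politics".
prefs = FeedPreferences(blocked_topics={"rage_bait"}, muted_topics={"politics"})
feed = apply_user_controls(
    [Candidate("a", "cooking", 0.7),
     Candidate("b", "rage_bait", 0.95),
     Candidate("c", "politics", 0.8)],
    prefs,
)
print([c.post_id for c in feed])  # -> ['a', 'c']
```

The point isn’t this particular filter, it’s that a law could require controls like these to exist and actually take effect, without the law having to describe the ranking algorithm itself.
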

Or perhaps you could create some kind of FAA-type entity that is deeply involved in the behind-the-scenes aspects of the social media industry’s standard operations, the way the FAA is with aircraft manufacturing, airspace, and airports.

Of course, the counterpoint is: just look at the FAA with Boeing, or even SpaceX. Regulatory capture is a thing, and with both Boeing and SpaceX it seems like the FAA (and in SpaceX’s case the EPA) either doesn’t really care to do its job, or the actual enforcement mechanisms are just too slow or cumbersome.