• 0 Posts
  • 5 Comments
Joined 4 years ago
Cake day: January 3rd, 2022


  • You can’t stop people from distributing CSAM. How would you possibly enforce that? Might as well not even try.

    If the child didn’t want sexual materials of them distributed around, they shouldn’t have taken them in the first place.

    If you don’t want some creep to sexualize your children, then keep them locked inside your house, dummy. Your child has no right to privacy in public.

    /s

    Taking a photo of someone in the background is vastly different from following a private citizen to record them covertly, then posting the recording online to single them out and get people to harass them.

    Taking a photo of a child is not illegal, but posting said photo online with the intent to sexualize them is.

    Taking a photo of a person is not illegal, but posting said photo online manipulated to make them nude or doxxing/harassing them should be.

    The key here is intent. And that’s how it could easily be enforced by law.

    In case I didn’t make it obvious: most of your arguments can be ripped apart simply by swapping the focus of the argument from ‘non-consensual derogatory use of likeness’ to CSAM.



  • Bozo take.

    Up until recently, you would need thousands of hours of Photoshop or visual effects experience to get even a mediocre result.

    With current AI, the barrier to entry is basically nothing, and the results can often be indistinguishable from reality.

    The solution is obvious: governments need to make non-consensual reproduction of an individual’s likeness illegal and actively enforce it.

    The tools are already out there. Regulating them is a lost cause at this point…