A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting both the danger and the growing prevalence of generative AI being put to nefarious use.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, producing dramatic footage as law enforcement led McCorkle, still in his work uniform, out of the theater in handcuffs.

  • ContrarianTrail@lemm.ee · 4 months ago

    I don’t think that’s fair. It could just as well be said that the purpose of violent games is to simulate real-life violence.

    Even if I grant you that the purpose of viewing CSAM is to see child abuse, it’s still less bad than actually abusing them, just as playing violent games is less bad than participating in real violence. Also, despite the massive increase in violent games and movies, actual rates of violence are going down, so implying that viewing such content would increase the cases of child abuse is an assumption I’m not willing to make either.

    • Leraje@lemmy.blahaj.zone · 4 months ago

      The purpose of a game is to play through a series of objectives and challenges.

      Even if I grant you that the purpose of viewing CSAM is to see child abuse

      Very curious to hear what else you think the purpose of watching CSAM might be.

      it’s still less bad than actually abusing them

      “Less bad” is relative. A bad thing is still bad. If we go by length of sentencing, then rape is ‘less bad’ than murder. That doesn’t make it ‘not bad’.

      so implying that viewing such content would increase the cases of child abuse is an assumption I’m not willing to make either.

      OK?

      I didn’t claim that AI CSAM increased anything at all. Literally all I’ve said is that the purpose of AI-generated CSAM is to watch kids being abused.

      Neither did I claim that violent games lead to violence. You invented that strawman all by yourself.

      • ContrarianTrail@lemm.ee · 4 months ago

        A person said that there is no victim in creating simulated CSAM with AI, just as there isn’t one in video games, to which you replied that the difference is intention: the intention in playing violent games is to play a game, whereas with viewing CSAM the intention is to view abuse material.

        Correct so far?

        Of course the intent is that. For what other reason would anyone want to see CSAM than to see CSAM? What kind of argument or conclusion is this supposed to be? How else am I supposed to interpret this than as you advocating for the criminalization of creating such content despite the fact that no one is being harmed? How is that not pre-emptively punishing people for crimes they’ve yet to even commit? Nobody chooses to be born with such thoughts or desires, so I don’t see the point of punishing anyone for that alone.

        • Leraje@lemmy.blahaj.zone · 4 months ago

          I’ve literally got no idea what you’re talking about or what your point is. Are you saying this person hasn’t committed a crime? Because that’s incorrect. Lots of jurisdictions have laws prohibiting things like AI-generated CSAM imagery, deepfake porn and a whole raft of other things. ‘Harm’ doesn’t begin and end with something done to an individual for a lot of crimes.

          • ContrarianTrail@lemm.ee · 4 months ago

            Are you saying this person hasn’t committed a crime?

            Yes, and if the law is interpreted in a way that makes it illegal, and the person is punished for it, then that’s a moral injustice and the kind of senselessness we as humans should grow out of. The fact that this “crime” has no victim is the whole point of why punishing it makes no sense.

            CSAM is illegal for a very good reason; producing it without abusing children is by definition impossible. By searching for and viewing such content, the person becomes part of the causal chain that leads to it being produced in the first place. By criminalizing it we attempt to deter people from looking for it, thus bringing down demand and disincentivizing its production.

            AI that is not trained on such content sits outside this loop. There is literally nobody being harmed if someone decides to use it to create depictions of such content. What it produces is not actual CSAM; by the very definition it cannot be, any more than shooting a person in a video game is murder. CSAM stands for Child Sexual Abuse Material (I hate even saying that), in other words, proof that the crime happened. AI-generated images are fiction. Nobody is being harmed. It’s just a more photorealistic version of a drawing. Treating it as actual CSAM in court is insanity.

            Now, if the AI has been trained on actual CSAM, and especially if the output simulates real people, then that’s a whole other discussion to be had. That, however, is not what we’re talking about here.