A 13-year-old student was expelled from a Louisiana middle school after hitting a male classmate who she said created and shared a deepfake pornographic image of her, according to her family’s lawyers.

  • Perspectivist@feddit.uk
    5 months ago

Technically speaking, there is no such thing as child porn - it’s abuse material, i.e. evidence of a crime. However, when the content is AI-generated, no crime has occurred, so it would be categorized as simulated abuse material.

    “Child porn” as a term shouldn’t really be used at all. It downplays what said content actually is. It’s similar to calling female genital mutilation “female circumcision”.

    • ObjectivityIncarnate@lemmy.world
      5 months ago

      Child porn as a term shouldn’t really be used at all.

      This is, linguistically, an unwinnable fight, imo. People understand what “porn” is (or is meant to be), and “child” is just a descriptor. People are never naturally going to start saying “abuse material” instead of “porn” in instances like these.

      We can’t even get people to consistently say STI instead of STD after all this time. You’ve got to pick your battles, lol.