A 13-year-old student was expelled from a Louisiana middle school after hitting a male classmate who she said created and shared a deepfake pornographic image of her, according to her family’s lawyers.
Technically speaking, there's no such thing as "child porn"; it's abuse material, i.e., evidence of a crime. When the content is AI-generated, however, no actual abuse has taken place, so it would be categorized as simulated abuse material.
"Child porn" as a term shouldn't really be used at all. It downplays what the content actually is. It's similar to calling female genital mutilation "female circumcision."
This is, linguistically, an unwinnable fight, imo. People understand what "porn" is (or is meant to be), and "child" is just a descriptor. People are never naturally going to start saying "abuse material" instead of "porn" in instances like these.
We can’t even get people to consistently say STI instead of STD after all this time. You’ve got to pick your battles, lol.