The vast majority of people don’t think in legal terms, and it’s always possible for something to be both legal and immoral. See: slavery, the actions of the third reich, killing or bankrupting people by denying them health insurance… and so on.
There are teenagers, even children, who posted works that have since been absorbed into AI training without their awareness or consent. Are literal children to blame for not understanding laws that companies would later abuse, when all they wanted was to share and participate in a community?
And AI companies aren’t merely using licensed material, they’re using everything they can get their hands on. If they’re pirating, you bet your ass they’ll use your nudes if they find them, public domain or not. Revenge porn posted by an ex? Straight into the data.
So your argument is:
It’s legal
But:
What’s legal isn’t necessarily right
You’re blaming children before companies
AI makers actually use illegal methods, too
It’s closer to victim blaming than you think.
The law isn’t a reliable compass for what is or isn’t right. When the law is wrong, it should be changed. IP law is infamously broken in how it advantages companies and gets abused by them. For a few popular examples: see how youtube mediates between companies and creators, nintendo suing everyone it can (which costs the victims far more than it costs nintendo), and everything disney did to IP legislation.
Okay, but I wasn’t arguing about morality or about children posting nudes of themselves. I’m just telling you that works released into the public domain can’t be retracted, and that there are models trained exclusively on open data, which a lot of AI haters don’t know, don’t understand, or won’t acknowledge. That’s all I’m saying. AI is not bad; corporations make it bad.
“The law isn’t a reliable compass for what is or isn’t right.”
Fuck yea it ain’t. I’m the biggest copyright and IP law hater on this platform, and I’ll get ahead of the next 10 replies by saying: no, it’s not because I want to enable mindless corporate content scraping; it’s because human creativity shouldn’t be boxed in. It should be shared freely, lest our culture be lost.