Climate change might kill you in a couple decades.
Wealth inequality might kill you in a couple years.
Fascists might kill you in a couple months.
If I threatened to impregnate someone and called for stochastic terrorism against a vice president, I’d be in a cell real quick.
But when you’re a slavery mine heir and tech tagalong, it’s just “bad takes” I guess.
“I don’t pay em to fuck, I pay em to leave”
https://scholars.unh.edu/unh_lr/vol17/iss1/18/
The Department of Justice estimates that American police officers shoot 10,000 pet dogs in the line of duty each year. It is impossible to ascertain a reliable number, however, because most law enforcement agencies do not maintain accurate records of animal killings. The tally may be substantially higher, and some suggest it could reach six figures.
The topic does not matter.
The assertion at hand does not matter.
Whether anyone believes the assertion does not matter.
The only thing that matters is: Are you playing offense or defense?
Cuz attacking looks like winning. And explaining looks like losing.
Seems like we’re going to be stuck in the uncanny valley of telepresence. The more fidelity we add, the more we’re able to pick up on microexpressions, subtle eye movements, and breathing, which helps trigger oxytocin and promote trust. But also, the more fidelity we add, the more attack surface we open up for malicious actors to exploit.
People leaving pro-AI comments in !fuck_ai@lemmy.world lmao
Risk compensation is a theory which suggests that people typically adjust their behavior in response to perceived levels of risk, becoming more careful where they sense greater risk and less careful if they feel more protected. Although usually small in comparison to the fundamental benefits of safety interventions, it may result in a lower net benefit than expected or even higher risks.
The pieces fit in my ass
I’m sympathetic to the reflexive impulse to defend OpenAI out of a fear that this whole thing results in even worse copyright law.
I, too, think copyright law is already smothering the cultural conversation and we’re potentially only a couple of legislative acts away from having “property of Disney” emblazoned on our eyeballs.
But don’t fall into their trap of seeing everything through the lens of copyright!
We have other laws!
We can attack OpenAI on antitrust, likeness rights, libel, privacy, and labor laws.
Being critical of OpenAI doesn’t have to mean siding with the big IP bosses. Don’t accept that framing.
Not even stealing cheese to run a sandwich shop.
Stealing cheese to melt it all together and run a cheese shop that undercuts the original cheese shops they stole from.
That’s the reason we got copyright, but I don’t think that’s the only reason we could want copyright.
Two good reasons to want copyright:
Accurate attribution:
Open source thrives on the notion that if there’s a new problem to be solved, and it requires a new way of thinking to solve it, someone will start a project whose goal is not just to build new tools to solve the problem but also to attract other people who want to think about the problem together.
If anyone can take the codebase and pretend to be the original author, that will splinter the conversation and degrade the ability of everyone to find each other and collaborate.
In the past, this was pretty much impossible because you could check a search engine or social media to find the truth. But with enshittification and bots at every turn, that looks less and less guaranteed.
Faithful reproduction:
If I write a book and make some controversial claims, yet it still provokes a lot of interest, people might be inclined to publish slightly different versions to advance their own opinions.
Maybe a version where I seem to be making an abhorrent argument, in an effort to mitigate my influence. Maybe a version where I make an argument that the rogue publisher finds more palatable, to use my popularity to boost their own arguments.
This actually happened during the early days of publishing, by the way! It’s part of the reason we got copyright in the first place.
And again, it seems like this would be impossible to get away with now, buuut… I’m not so sure anymore.
—
Personally:
I favor piracy in the sense that I think everyone has a right to witness culture even if they can’t afford the price of admission.
And I favor remixing because the cultural conversation should be an active read-write two-way street, not just passive consumption.
But I also favor some form of licensing, because I think we have a duty to respect the integrity of the work and the voice of the creator.
I think AI training is very different from piracy. I’ve never downloaded a mega pack of songs and said to my friends “Listen to what I made!” I think anyone who compares OpenAI to pirates (favorably) is unwittingly helping the next set of feudal tech lords build a wall around the entirety of human creativity, and they won’t realize their mistake until the real toll booths open up.
You’re presupposing the superiority of science. What good is knowing the chemical composition of a mind, if such chemicals are but shadows on the cave wall?
You can’t actually witness a rock, in its full objective “rock-ness”. You can only witness yourself perceiving the rock. I call this the Principle of Objective Things in Space.
Admittedly, the study of consciousness is still in its infancy, especially compared to the study of the physical world. But it would be foolish to discard the entire concept when it is unavoidably fundamental. Suppose we do invent teleporters and they do erase consciousness. Doesn’t it say something about the peril of worshipping quantification over all else, that we wouldn’t even know until we had already teleported all of our bread? The entire field is babies. I am heavy ideas guy and this is my PoOTiS.
To quote Searle: Should I pinch myself and report the results in the Journal of Philosophy?