Technology is always progressing, but nobody can say what the next big thing will be; if you really were that prescient, you could make loads of cash predicting things. Companies are hungry for the next big thing, though, and will do everything to convince us that they have it. AI is an enticing grift because it's so misunderstood. The next big thing wasn't AR or VR or the metaverse, and I don't think it's going to be generative AI either; it's already plateauing and isn't profitable, even with billions of dollars behind it.
I wouldn't advocate for someone eating palm oil simply for their own personal health. However, if you want to talk about the environment, way more land is cleared for livestock than for oil palm, even if you just focus on the regions where oil palm is grown. And palm oil usually replaces animal fats like lard and ghee in cooking, because of its saturated fat content.
Something like Microsoft Word or Paint is not generative.
It is standard for publishers to make indemnity agreements with the creatives who produce for them, because, like I said, it's kinda difficult to prove a negative (that a work is not plagiarized), so a publisher doesn't want to take the risk of distributing works whose originality cannot be verified.
I'm not arguing that we should change any laws, just that people should not use these tools for commercial purposes if the producers of these tools will not take on liability; if the producers refuse to do so, their tools are very risky to use.
I don't see how my position affects the general public's use of these tools; it's purely about the relationship between creatives and publishers who use AI tools, and what they should expect and demand.
Anyway, as a publisher, if I cannot get OpenAI/ChatGPT to sign an indemnity agreement making them at fault for plagiarism, then their tool is effectively useless, because it is really hard to determine that something is not plagiarism. That makes ChatGPT pretty sus for creatives to use. So who is going to pay for it?
If everyone got a lucky-number tattoo before they could even talk, something nonconsensual and superstitious, some people would end up liking their tattoo or not caring either way. Such a person can still find the practice wrong, horrific even. Having personal trauma does not justify assuming people's positions and calling them shitheads.
While I agree that using copyrighted material to train your model is not theft, the text that model produces can very much be plagiarism, and OpenAI should be on the hook when it occurs.
It's not hypocritical to care about some parts of copyright and not others. For example, most people in the FOSS crowd don't really care about using copyright to profit as the sole distributor of a work, but they do care about attribution.
We are allergic to exploiting great solutions that already exist. Everyone wants to be "disruptive".
It reminds me of the investment that went into hyperloop stuff when our current best transit solutions aren't anywhere close to full saturation in the US. Similarly our current best green technologies are far from being fully exploited.
That's not important. I was illustrating that, clearly, if nobody ate chicken, nobody would harvest chickens for food. Unless you think the same number of chickens will be harvested until the very last human gives up chicken, you have to acknowledge that the individual consumer does make a difference.
If you don't eat chicken, nobody is going to swoop in and eat all the chicken you don't eat. However, if a farmer or farming corporation decides to stop harvesting chickens, it's almost certain some entity will swoop in to replace them in the market. So acting like the consumer is not one of the most important parts of this causal chain, if not the most important, is just naive.
Are we allowed to kink shame whatever this is?