I doubt that. New services that host the open models are cropping up all the time. They're like VPS hosting providers (in fact, existing VPS hosts will soon break out into that space too).
It's not like Big AI has some huge advantage over the open source models. In fact, for images they're a little bit behind!
The FOSS coding models are getting pretty fantastic and they get better all the time. It seems like once a month a new, free model comes out that eclipses the previous generation.
The mistakes it makes depend on the model and the language. GPT-5 models can make horrific mistakes though, where they randomly remove huge swaths of code for no reason. Every time it happens I'm like, "what the actual fuck?" Undoing the last change and trying again usually fixes it though 🤷
They all make horrific security mistakes quite often. Though, that's probably because they're trained on human code that is *also* chock full of security mistakes (former security consultant, so I'm super biased on that front haha).
Schrödinger's AI: it is both useless shit that can only generate "slop" and, at the same time, so effective that it's the reason behind 50,000 layoffs/going to take everyone's jobs.
You want to see someone using, say, VS Code to write something with, say, Claude Code?
There's probably a thousand videos of that.
More interesting: I watched someone who was super cheap trying to use multiple AIs to code a project because he kept running out of free credits. Every now and again he'd switch accounts and use up those free credits.
That was an amazing dance, let me tell ya! Glorious!
I asked him which one he'd pay for if he had unlimited money and he said Claude Code. He has the $20/month plan but only uses it in special situations because he'll run out of credits too fast. $20 really doesn't get you much with Anthropic 🤷
That inspired me to try out all the code assist AIs and their respective plugins/CLI tools. He's right: Claude Code was the best by a HUGE margin.
Gemini 3.0 is supposed to be nearly as good but I haven't tried it yet so I dunno.
Now that I've said all that: I am severely disappointed in this article because it doesn't say which AI models were used. In fact, the study authors don't even know what AI models were used. So it's 430 pull requests of random origin, made at some point in 2025.
For all we know, half of those could've been made with the Copilot gpt5-mini that everyone gets for free when they install the Copilot extension in VS Code.
Blaming the technology is useless and unhelpful. It's like blaming Photoshop for letting people remove their faces from obvious CSAM.
Kids need to be taught about this stuff! Bitching and trying to regulate the tech is a pointless waste of time. Teaching children how to deal with it is the only realistic thing that can be done.
Sexual harassment is a crime at work. Outside of that, it's just another form of bullying/harassment.
Assholes will catcall girls as they walk past but that's not illegal. It's no different than shouting expletives at minorities. It's a dick move, but it's not illegal. Not in the US, anyway (because of the First Amendment).
I'd also like to point out that she didn't get expelled for "trying to stop" the bullying. She got expelled for violence (against the bullies). That is why I think it's the zero-tolerance policies that are a huge part of the problem.
If a girl slaps a boy that's harassing her, there are very few actual consequences from that. The boy might be sore for a day or two but he might leave her alone after that. Yet our policies forbid this in the most extreme way possible. It's like being put to death for trampling flowers.
People freak TF out because things like deepfakes exist. We need to be rational about stuff like this. Manipulating photos and videos isn't going away. The only point where we can reasonably do something about it is at the point of distribution.
I remember when Photoshop started to become easily pirated (via AOL, LOL). Around that time, kids were using it to paste girls' faces onto other naked women's bodies. I also remember news articles from when Facebook was still new, with people creating fake profiles of people and sharing similarly-manipulated images.
What we need is for kids to be taught (in school) the first rule of porn: IT IS FAKE. Bullying will never cease but at the very least we can teach kids not to react so strongly to fake, transient nonsense about them.
They need to be reminded that when someone does something like generating a deepfake of them, it has no bearing on their life until it is distributed. They need to be taught how to gather evidence and the correct places to report such things.
We don't teach kids such things in school because schools want discretion. Since schools are so obviously abusing that discretion with zero-tolerance policies, perhaps we should take it away.
Good games are orthogonal to AI usage. It's possible to have a great game that was written with AI using AI-generated assets. Just as much as it's possible to have a shitty one.
If AI makes creating games easier, we're likely to see 1000 shitty games for every good one. But at the same time we're also likely to see successful games made by people who had great ideas but never had the capital or skills to bring them to life before.
I can't predict the future of AI but it's easy to imagine a state where everyone has the power to make a game for basically no cost. Good or bad, that's where we're heading.
If making great games doesn't require a shitton of capital, the ones who are most likely to suffer are the rich AAA game studios. Basically, the capitalists. Because when capital isn't necessary to get something done anymore, capital becomes less useful.
Effort builds skill but it does not build quality. You could put in a ton of effort and still fail or just make something terrible. What breeds success is iteration (and luck). Because AI makes iteration faster and easier, it's likely we're going to see a lot of great things created using it.
They will. Otherwise they're throwing money away and leaving room in the market for competitors.
The big problem, though, is that DRAM manufacturing requires a shitton of money to get into, and you'd have to poach talent from existing players. Otherwise you'll never be able to get started. It's just too complicated.
China has been trying to catch up with Taiwan in chip manufacturing for like 20 years now and they're still at least 10 years behind. Probably 15 or more because of the way funding/investment works over there.