That’s a way bigger story. Open Source projects need at least a “you have to tell us if you used AI” policy. Even then, slop reviews and slop PRs are a recipe for disaster.
Beyond the ethical, environmental, and legal issues, it just means you’re more likely to get complacent. Most projects have some sort of BDFL who guides the overall vision, and if that gets co-opted by a system designed to trick people into thinking it’s smart, you’re fucked.