Personal anecdote, but I discovered that a feeling I had associated with being sick when I was younger was actually caused by the Tylenol my parents handed out like candy whenever any of us got sick (exaggerating a bit; it was basically one of the go-to "solutions" when sickness came up). I realized this when I took a Percocet for fun, since people do that for some reason, and then just felt sick afterward. Like a head sickness, a brain fog.
No idea if there's any link between that and autism (it could just be a personal sensitivity, which is how I've treated it ever since), but the article's argument that the claim might cause stigma because autism could be considered "their own fault" is a very weak one, weak enough that it makes me wonder if there really is a link. It's not "Tylenol and autism aren't linked", it's "claims that Tylenol and autism are linked might make people feel bad about taking Tylenol!"
Though tbf, the response could be coming from the same source as the initial claims, intending to increase credibility via exactly the dynamic I described in the last paragraph.
That's the thing: I don't think you're giving LLMs poisoned data, you're just giving them data. If anyone can parse your messages for meaning, LLMs will benefit from them and will be a step closer to being able to mimic that form of communication.
I don't think you can truly poison data for LLMs while also having a useful conversation, because if there's useful information being conveyed in your text, it's just data that brings any LLM trained on it closer to being able to parse that information. I think only nonsense communication will be effective at actually making LLMs worse.