Delivery might be part of it. The advice sounds very similar to the typical "if you want to be less depressed, just don't be depressed" or "you don't need a bottle of something that rattles, you need a pair of sneakers and fresh air" business, so a lot of people automatically file it away under the same category.
Something closer to what the researchers actually suggested, that even a little movement helps, and that it doesn't replace medication but may make it more effective, would land better. But a lot of people will just read the headline and move on.
Though this is more about targeting retrieval-augmented generation (RAG) than the training process.
Specifically, since RAG systems don't weight some sources over others, anyone can effectively alter the results by writing a blog post on the relevant topic.
Whilst people really shouldn't use LLMs as a search engine, many do, and being able to alter the "results" like that would be an avenue of attack for someone intending to spread disinformation.
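To make the problem concrete, here's a toy sketch of unweighted retrieval, assuming a naive bag-of-words cosine similarity and two made-up sources (the domain names and corpus are purely illustrative, not from any real system). Because every document is scored only on keyword overlap with the query, a keyword-stuffed blog post can outrank a more authoritative source:

```python
import math
from collections import Counter

def vectorize(text):
    # naive bag-of-words term counts
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus, k=2):
    # every document is scored purely on similarity to the query;
    # there is no notion of source authority or trust
    q = vectorize(query)
    return sorted(corpus,
                  key=lambda d: cosine(q, vectorize(d["text"])),
                  reverse=True)[:k]

corpus = [
    {"source": "medical-journal.example",
     "text": "exercise is a useful adjunct to antidepressant medication"},
    {"source": "random-blog.example",
     "text": "exercise exercise exercise cures depression "
             "medication is useless exercise"},
]

top = retrieve("does exercise help depression alongside medication", corpus)
# the keyword-stuffed blog post ranks first, because nothing in the
# scoring distinguishes it from the journal article
print(top[0]["source"])
```

Real RAG pipelines use embedding models rather than bag-of-words counts, but the structural weakness is the same: if relevance is the only ranking signal, whoever writes the most query-matching text wins.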
It's probably also bad for people who don't use LLMs at all, since it gives SEO spam websites yet another reason to exist, and they were trouble enough as it is.