AI companies are training models on photos and texts posted only for your friends to see, and worse, also on emails, personal images people back up to the cloud, and so on. That's private information. It shouldn't be used for training models.
With public information that everyone can see, it's a gray area from my point of view. If a magazine takes a public photo and uses it to sell copies, they're stealing from the artist. But if they take that same photo and use it to train and sell an AI model, it's a harder situation to assess. I think our best approach so far is to respect the author's wishes if they explicitly opt out. And yes, of course I believe in intellectual property and copyright, if that was your question. They exist for a reason, and they benefit not only big corporations but also small and independent artists and content creators.
https://www.theverge.com/meta/694685/meta-ai-camera-roll
Just a recent example. Of course they're vague about what "public" means, but if you really believe they aren't using all the photos, you'd be pretty naive in my eyes.
If that's what you want to call conservative, go ahead, although it's not what I'd typically associate with that word. I'm not sure where you see the problem: what does taxing wealth at increasing rates to reduce inequality have to do with enforcing intellectual property to protect intellectual workers?