• lenz@lemmy.ml · 31 points · 7 hours ago (edited)

    I read the article. This is exactly what happened when my best friend developed schizophrenia. I think the people affected by this were probably already prone to psychosis or on the verge of becoming schizophrenic, and that ChatGPT is merely the mechanism through which their psychosis manifested. If AI didn’t exist, it would probably have been astrology or conspiracy theories or QAnon or whatever that ended up triggering it in people who were already prone to psychosis. But the problem with ChatGPT in particular is that it validates the psychosis… that is very bad.

    ChatGPT actively screwing with mentally ill people is a huge problem you can’t just blame on stupidity, like some people in these comments are doing. This is exploitation of a vulnerable group of people whose brains lack the mechanisms to defend against this stuff. They can’t help it. That’s what psychosis is. This is awful.

    • sugar_in_your_tea@sh.itjust.works · 9 points · 6 hours ago (edited)

      the problem with ChatGPT in particular is that it validates the psychosis… that is very bad.

      So do astrology and conspiracy theory groups on forums and other social media; the main difference is whether you’re getting that validation from humans or from a machine. To me, that’s a pretty unhelpful distinction, and we attack both problems the same way: early detection and treatment.

      Maybe computers can help with the early detection part. They certainly can’t do much worse than what’s currently happening.

      • lenz@lemmy.ml · 2 points · 4 hours ago

        I think having that kind of validation at your fingertips, whenever you want it, is worse. At least people, even people deep in the claws of a conspiracy, can disagree with each other. At least they know what they are saying. The AI always says what the user wants and expects to hear. I can see how that distinction may matter little to some, but I think ChatGPT has capabilities that make it worse than anything a forum could do.

        • sugar_in_your_tea@sh.itjust.works · 1 point · 31 minutes ago

          Sure. But on the flip side, you can ask it the opposite question (“tell me the issues with <belief>”) and it’ll do that as well; you’re not going to get that from a conspiracy theory forum.

    • Maeve@kbin.earth · 4 points · 7 hours ago

      I think this is largely people seeking confirmation that their delusions are real, and whatever gives them that confirmation is what they’re going to attach themselves to.