Hello all, long-time lurker, sometimes poster. In my line of work some of my co-workers seem eager to turn to the clanker for an instant answer to any roadblock. I feel it's better to problem-solve the old-fashioned way: some good old research and finding a blog that is not AI slop LOL. Do those of you in a support role feel any peer pressure to use LLMs?

  • bungusbread@lemmy.world
    1 day ago

    You would be correct. They hallucinate anywhere from 5% to over 50% of the time depending on task complexity, and there's no reliable way to make them stop. Relying on them for information is genuinely akin to asking a paranoid schizophrenic living in your pocket what they think you should do.