• undefinedTruth@lemmy.zip
    23 days ago

    You give it too much credit. In order to lie, it would first need to be capable of actually understanding what it writes. LLMs are text prediction algorithms. They cannot think.
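    To make the "text prediction" point concrete, here's a minimal sketch of the idea using a toy bigram model (a deliberately simplified stand-in; real LLMs are neural networks over subword tokens, but the interface is the same: context in, next-token guess out). The corpus and function names here are made up for illustration.

    ```python
    from collections import Counter, defaultdict

    # Toy next-token predictor: given the current word, emit whichever
    # word most often followed it in the training text. No understanding
    # involved, just counting what tends to come next.
    corpus = "the cat sat on the mat and the cat slept".split()

    follows = defaultdict(Counter)
    for cur, nxt in zip(corpus, corpus[1:]):
        follows[cur][nxt] += 1

    def predict_next(word):
        # Pick the most frequent continuation seen in the corpus.
        return follows[word].most_common(1)[0][0]

    print(predict_next("the"))  # "cat" followed "the" twice, "mat" once
    ```

    The model emits plausible continuations without any notion of truth or falsity, which is the gap the comment is pointing at.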