It’s clear that companies are currently unable to make chatbots like ChatGPT comply with EU law when processing data about individuals. If a system cannot produce accurate and transparent results, it cannot be used to generate data about individuals. The technology has to follow the legal requirements, not the other way around.

  • antidote101@lemmy.world · 8 months ago

    They don’t recall information from a repository; the training data is translated into a set of topic-based weighted probabilities of which words come next.

    Those probabilities are then used to reconstruct a best guess at which words come next when generating strings of language.

    It’s not recall, it’s a form of “free” association, one that is quite tightly bound by the context, the topic, and the weightings of the training data.

    This process is not precise; it tends to produce average answers and sentences rather than precise ones.
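    The idea above can be sketched with a toy next-word generator. This is not how an actual LLM works internally (real models use neural networks over token embeddings, not word-count tables); it is only a minimal illustration of “sample the next word in proportion to learned weights.” The corpus and function names are made up for the example.

    ```python
    import random
    from collections import defaultdict

    # Tiny made-up "training data": a toy stand-in for a real corpus.
    corpus = "the cat sat on the mat the cat ate the fish".split()

    # "Training": count how often each word follows each other word.
    # These counts act as the weighted probabilities described above.
    weights = defaultdict(lambda: defaultdict(int))
    for prev, nxt in zip(corpus, corpus[1:]):
        weights[prev][nxt] += 1

    def generate(start, length=5, seed=0):
        """Generate text by repeatedly sampling a next word
        in proportion to its weight, not by recalling any stored sentence."""
        random.seed(seed)
        word, out = start, [start]
        for _ in range(length):
            options = weights.get(word)
            if not options:
                break
            words, counts = zip(*options.items())
            word = random.choices(words, weights=counts)[0]
            out.append(word)
        return " ".join(out)
    ```

    Notice the generator can only ever emit words the training data made likely in that context, which is why the output reads as a plausible average of the corpus rather than a precise recollection of it.
    
    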

    It’s not recall, it’s really convincing lies.

    “He seems to know what he’s talking about, and speaks with a certain kind of authority which makes sense and sounds knowledgeable”.