• T156@lemmy.world · 3 days ago

    I would be surprised if it was something they trained themselves, rather than an off-the-shelf model hooked up to a search.

    • brucethemoose@lemmy.world · 2 days ago

      It’s probably their own search/RAG backend, or at least their own configuration of some open-source project.

      And that’s the important part. Get the article retrieval right, and the LLM performance isn’t that important; they could self-host something like Qwen 2.5 32B and it’d work fine.
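
The retrieval-first point above can be sketched in a few lines: if the retriever hands the model the right article, the generation step is easy. This is a minimal illustration using keyword overlap as a stand-in retriever; the article texts and function names are hypothetical, not anything from the actual backend being discussed.

```python
# Minimal sketch of retrieval-augmented generation (RAG) retrieval:
# rank stored articles by word overlap with the query, then pass the
# best match to whatever LLM is available. All names are hypothetical.

def retrieve(query: str, articles: list[str]) -> str:
    """Return the stored article sharing the most words with the query."""
    q = set(query.lower().split())
    return max(articles, key=lambda a: len(q & set(a.lower().split())))

articles = [
    "Qwen is a family of open-weight language models.",
    "RAG pipelines pair a retriever with a generator model.",
    "Search backends index articles for fast lookup.",
]

best = retrieve("how does a RAG retriever work", articles)
# The LLM only has to summarize `best` -- retrieval did the heavy lifting.
print(best)
```

A real backend would swap the word-overlap scorer for BM25 or embedding similarity, but the shape is the same: the generator's job stays small as long as the retrieval step surfaces the right document.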