• 11 Posts
  • 997 Comments
Joined 2 years ago
Cake day: August 15th, 2023


  • So here’s what I don’t get. LLMs were trained on data from places like SO. As SO loses users, it loses content, and that’s the very content LLMs ingest to stay relevant.

    So where will LLMs get their content after a certain point? Especially for new things that come out or unique situations. It’s not like they can scrape the answer from a web page if people are only asking LLMs and nobody is posting answers anymore.

  • To vintage’s point: the way I view it, there’s no chance of AGI via the current method of hopped-up LLM/ML, but that doesn’t mean we won’t uncover a method in the future. Bio-engineering to recreate a neural network, for example, or extracting neurons via stem cells with some sort of electrical interface. My initial point was that it’s way off, not that it’s impossible. One day someone will go “well, that’s interesting” and we’ll have a whole new paradigm.