  • Press A to doubt.

    LLMs do not have volition, even the “reasoning” models. This is still super-Clippy: you can ask it all you want about the office suite, but anything outside of that context is nonsense to even talk about. Tone, lying, perceived intent, or any of that is an artifact of training/fine-tuning/distillation. The reasoning logic itself is just trained on good solutions, bad solutions, right answers, wrong answers, and “truthiness”. It’s not easy to manufacture a model that is by and large coherent, but in no way is anybody putting forth a path to generalized intelligence, which is a way, WAY harder problem.
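
    A rough sketch of what that kind of outcome-based training looks like, just to make the point concrete (names like `Candidate` and `build_training_pairs` are illustrative, not from any real library): sample candidate solutions, score them purely on whether the final answer matches, and keep the winners as the next round of fine-tuning data. Nothing in the loop models intent or honesty; “lying” is just a low-reward output.

    ```python
    # Hypothetical sketch of outcome-supervised "reasoning" training:
    # reward depends only on the final answer, not on any notion of intent.
    from dataclasses import dataclass


    @dataclass
    class Candidate:
        chain_of_thought: str
        final_answer: str


    def reward(candidate: Candidate, reference_answer: str) -> float:
        # Graded only on "truthiness" of the final answer.
        return 1.0 if candidate.final_answer.strip() == reference_answer.strip() else 0.0


    def build_training_pairs(candidates: list[Candidate], reference_answer: str):
        # Good solutions become positive examples, bad ones negative examples;
        # the next fine-tuning pass just shifts probability toward the former.
        accepted = [c for c in candidates if reward(c, reference_answer) == 1.0]
        rejected = [c for c in candidates if reward(c, reference_answer) == 0.0]
        return accepted, rejected


    if __name__ == "__main__":
        samples = [
            Candidate("2 + 2 = 4, so the answer is 4", "4"),
            Candidate("2 + 2 = 5, so the answer is 5", "5"),
        ]
        good, bad = build_training_pairs(samples, "4")
        print(len(good), "accepted,", len(bad), "rejected")
    ```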