Eh, LLMs do have a significant problem in that they can generate false information on their own. Every prior tool required a person to create that false information, but an LLM can just generate it when asked a question.
Last I checked, you kinda need lungs to breathe. And also last I checked, arms and legs aren't organs.