There's a story about a guy who asked his LLM to remind him to do something in the morning, and it ended up burning quite a lot of money by making an unnecessary API call every 30 minutes to check whether daylight had broken. Such is the supposedly helpful assistant.
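The pattern is easy to picture. Here's a minimal sketch, with names assumed for illustration: `ask_llm` stands in for whatever paid completion endpoint the agent was actually calling (stubbed out here so the snippet runs), next to the zero-cost version any scheduler would use.

```python
import time
from datetime import datetime, timedelta

def ask_llm(prompt: str) -> str:
    # Stand-in for the paid model API call the agent was making.
    # Stubbed so this runs without credentials.
    return "yes" if 6 <= datetime.now().hour < 12 else "no"

def wasteful_reminder(task: str) -> None:
    # The anti-pattern: burn one paid model call every 30 minutes
    # just to ask whether it's morning yet.
    while ask_llm("Has daylight broken yet? Answer yes or no.").lower().startswith("no"):
        time.sleep(30 * 60)
    print(f"Reminder: {task}")

def cheap_reminder(task: str, wake_hour: int = 7) -> None:
    # The obvious fix: compute the wake time once, sleep until then.
    # Zero API calls.
    now = datetime.now()
    wake = now.replace(hour=wake_hour, minute=0, second=0, microsecond=0)
    if wake <= now:
        wake += timedelta(days=1)
    time.sleep((wake - now).total_seconds())
    print(f"Reminder: {task}")
```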
It's a metaphor for the cooked humans spinning up super-exploitable chatbots to do this stuff for them.