It spouts out generic and outdated answers when asked specific questions, which I can identify as wrong (skill issue, lol). If you are super confident using them, maybe you are not really knowledgeable enough about those things. Skill issue, I guess.
That is my whole point about investors: smart money does not stick around in such things for long. They see through the bullshit.
So if I am understanding it, LLMs are not using the easier option of reverse image search because they are not aware of it?
Crypto is as valuable as you want it to be because it is not real. Not being real is a big problem for a currency.
You are getting more surface-level information from it, which is probably going to be correct unless there is a major problem in the training data.
I still think they are good. Isolated incidents like this are going to happen when you are doing business at such a scale.
The confidence is the problem. If a human does not know the answer, they say that they don’t know. LLMs do not seem to know that is an option.