Tixanou@lemmy.world to Lemmy Shitpost@lemmy.world · 5 months ago
AI is the future
kate@lemmy.uhhoh.com · 5 months ago
Can’t even rly blame the AI at that point
TheFriar@lemm.ee · 5 months ago
Sure we can. If it gives you bad information because it can’t differentiate between a joke and good information… well, seems like the blame falls exactly at the feet of the AI.
kate@lemmy.uhhoh.com · 5 months ago
Should an LLM try to distinguish satire? Half of Lemmy users can’t even do that
KevonLooney@lemm.ee · 5 months ago
Do you just take what people say on here as fact? That’s the problem: people are taking LLM results as fact.
BakerBagel@midwest.social · 5 months ago
It should if you’re gonna feed it satire to learn from
xavier666@lemm.ee · 5 months ago
Sarcasm detection is a very hard problem in NLP, to be fair
ancap shark@lemmy.today · 5 months ago
If it’s being used to give the definitive answer to a search, then it should. If it can’t, then it shouldn’t be used for that