SnokenKeekaGuard@lemmy.dbzer0.com to Aneurysm Posting@sopuli.xyz · 22 days ago
stuttering search engine [image]
This is real?
EpeeGnome@feddit.online · 22 days ago
I couldn’t replicate it, but I’ve seen LLMs fail this way before, so it’s very plausible.
Willdrick@lemmy.world · 21 days ago
Yup, too cold a temp does that, and the model repeats the last token/word ad infinitum
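A minimal sketch of what that comment describes, assuming standard temperature sampling over logits (the vocabulary and the numbers are made up for illustration): dividing logits by a near-zero temperature collapses the softmax onto the single most likely token, so sampling degenerates into argmax and the output can repeat that token indefinitely.

# Hypothetical illustration, not from the thread: how a too-low
# sampling temperature pushes generation toward greedy decoding,
# which is prone to repeating the top token forever.
import math
import random

def sample(logits, temperature):
    """Sample a token index from logits at the given temperature."""
    # Dividing by a tiny temperature exaggerates the gaps between
    # logits, so the softmax collapses onto the largest one.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]  # subtract max for numerical stability
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(range(len(logits)), weights=probs)[0]

# Toy vocabulary and made-up logits where "search" is mildly favored.
vocab = ["search", "engine", "the", "a"]
logits = [2.0, 1.5, 1.0, 0.5]

for temp in (1.0, 0.05):
    out = [vocab[sample(logits, temp)] for _ in range(10)]
    print(f"temperature={temp}: {' '.join(out)}")
# At temperature 1.0 the output stays varied; at 0.05 it degenerates
# into "search search search ..." because sampling is effectively argmax.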
okwhateverdude@lemmy.world · 22 days ago
Believable, given how robots spit out tokens.
Zorcron@lemmy.zip · 21 days ago
I suppose it’s possible, but I’m always suspicious of screenshots like this, just because they’d be so easy to fake: either by giving a different prompt than the one in the screenshot, or by using inspect element to completely edit the response text.