Kid@sh.itjust.works (M) to Cybersecurity@sh.itjust.works · English · 17 days ago
A nearly undetectable LLM attack needs only a handful of poisoned samples - Help Net Security (www.helpnetsecurity.com)
Kairos@lemmy.today · 17 days ago
Is the "attack" the fact that LLMs fundamentally can't distinguish between instructions and data?