Kid@sh.itjust.worksM to Cybersecurity@sh.itjust.worksEnglish · 12 days agoA nearly undetectable LLM attack needs only a handful of poisoned samples - Help Net Securitywww.helpnetsecurity.comexternal-linkmessage-square2linkfedilinkarrow-up137arrow-down12
arrow-up135arrow-down1external-linkA nearly undetectable LLM attack needs only a handful of poisoned samples - Help Net Securitywww.helpnetsecurity.comKid@sh.itjust.worksM to Cybersecurity@sh.itjust.worksEnglish · 12 days agomessage-square2linkfedilink
minus-squareKairos@lemmy.todaylinkfedilinkEnglisharrow-up8·12 days agoIs the “attack” the fact that LLM fundamentally can’t distinguish between instructions and data?