Sahwa@reddthat.com to Fuck AI@lemmy.world · 1 month ago
Amazon's Rufus AI shopping assistant can be easily jailbroken and tricked into answering other questions — specific prompts break the chatbot's guidelines and reach underlying AI engine
www.tomshardware.com
27 comments · 202 upvotes · 1 downvote
TʜᴇʀᴀᴘʏGⒶʀʏ⁽ᵗʰᵉʸ‘ᵗʰᵉᵐ⁾@lemmy.dbzer0.com · 3 points · 1 month ago
Yeah I just used Rufus last week lol