return2ozma@lemmy.world to Technology@lemmy.world · 7 months ago

ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
RisingSwell@lemmy.dbzer0.com · 7 months ago

It’s really easy to make explosives. Making them stable and reliable is the hard part.