misk@sopuli.xyz to Technology@lemmy.world · English · 1 year ago
Jailbroken AI Chatbots Can Jailbreak Other Chatbots (www.scientificamerican.com)
cross-posted to: [email protected], [email protected]
The Barto@sh.itjust.works · English · edited 1 year ago
Legitimate reason? No, but there’s always a reason to know how to make napalm.