shish_mish@lemmy.world to Technology@lemmy.world (English) · 9 months ago
Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries (www.tomshardware.com)
spujb@lemmy.cafe (English) · 8 months ago
The fact that you explained the problem doesn't make it not a problem.