ugjka@lemmy.world to Technology@lemmy.worldEnglish · 11 months agoSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeexternal-linkmessage-square294fedilinkarrow-up11Karrow-down116 cross-posted to: [email protected]
arrow-up1988arrow-down1external-linkSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeugjka@lemmy.world to Technology@lemmy.worldEnglish · 11 months agomessage-square294fedilink cross-posted to: [email protected]
minus-squareBakedCatboy@lemmy.mllinkfedilinkEnglisharrow-up56·edit-211 months agoApparently it’s not very hard to negate the system prompt…
minus-squaretocopherol@lemmy.dbzer0.comlinkfedilinkEnglisharrow-up1arrow-down1·10 months agodeleted by creator
Apparently it’s not very hard to negate the system prompt…
deleted by creator