ugjka@lemmy.world to Technology@lemmy.worldEnglish · 11 months agoSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeexternal-linkmessage-square104fedilinkarrow-up1627arrow-down113
arrow-up1614arrow-down1external-linkSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeugjka@lemmy.world to Technology@lemmy.worldEnglish · 11 months agomessage-square104fedilink
minus-squareBakedCatboy@lemmy.mllinkfedilinkEnglisharrow-up34·edit-211 months agoApparently it’s not very hard to negate the system prompt…
Apparently it’s not very hard to negate the system prompt…