return2ozma@lemmy.world to Technology@lemmy.worldEnglish · 7 hours agoAI agents now have their own Reddit-style social network, and it's getting weird fastarstechnica.comexternal-linkmessage-square20fedilinkarrow-up1114arrow-down15cross-posted to: [email protected]
arrow-up1109arrow-down1external-linkAI agents now have their own Reddit-style social network, and it's getting weird fastarstechnica.comreturn2ozma@lemmy.world to Technology@lemmy.worldEnglish · 7 hours agomessage-square20fedilinkcross-posted to: [email protected]
minus-square𝓹𝓻𝓲𝓷𝓬𝓮𝓼𝓼@lemmy.blahaj.zonelinkfedilinkEnglisharrow-up7·4 hours agodoesn’t even have to be the site owner poisoning the tool instructions (though that’s a fun-in-a-terrifying-way thought) any money says they’re vulnerable to prompt injection in the comments and posts of the site
minus-squareBradleyUffner@lemmy.worldlinkfedilinkEnglisharrow-up7·4 hours agoThere is no way to prevent prompt injection as long as there is no distinction between the data channel and the command channel.
doesn’t even have to be the site owner poisoning the tool instructions (though that’s a fun-in-a-terrifying-way thought)
any money says they’re vulnerable to prompt injection in the comments and posts of the site
There is no way to prevent prompt injection as long as there is no distinction between the data channel and the command channel.