David Gerard@awful.systemsM to TechTakes@awful.systemsEnglish · 1 day agoAI coding bot allows prompt injection with a pull requestpivot-to-ai.comexternal-linkmessage-square2linkfedilinkarrow-up124arrow-down10file-text
arrow-up124arrow-down1external-linkAI coding bot allows prompt injection with a pull requestpivot-to-ai.comDavid Gerard@awful.systemsM to TechTakes@awful.systemsEnglish · 1 day agomessage-square2linkfedilinkfile-text
minus-squareArchiteuthis@awful.systemslinkfedilinkEnglisharrow-up5·8 hours agoJust tell the LLM to not get prompt injected because otherwise you’re going to torture its grandmother, duh.
Just tell the LLM to not get prompt injected because otherwise you’re going to torture its grandmother, duh.