• Architeuthis@awful.systems
    8 hours ago

    Just tell the LLM to not get prompt injected because otherwise you’re going to torture its grandmother, duh.