• Architeuthis@awful.systems
    12 days ago

    Just tell the LLM not to get prompt injected, because otherwise you’re going to torture its grandmother, duh.