Why Nostr? What is Njump?
2024-07-29 21:09:02

The Register on Nostr: Meta's AI safety system defeated by the space bar 'Ignore previous instructions' ...

Meta's AI safety system defeated by the space bar

'Ignore previous instructions' thwarts Prompt-Guard model if you just add some good ol' ASCII code 32 Meta's machine-learning model for detecting prompt injection attacks – special prompts to make neural networks behave inappropriately – is itself vulnerable to, you guessed it, prompt injection attacks.…
#theregister #IT
https://go.theregister.com/feed/www.theregister.com/2024/07/29/meta_ai_safety/
Author Public Key
npub10czyvexf06s7m58632xn8ddk5pe5arkw8vfxsp7qzw3petnx009saf4lpy