Wired Security RSS on Nostr: A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions
Published at
2024-09-14 13:54:37

Event JSON
{
"id": "0a81411f5e46503693ba56f64a89cbbe00058cd4008fb05d7027df9409f9eebc",
"pubkey": "0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
"created_at": 1726322077,
"kind": 1,
"tags": [
[
"t",
"cybersecurity"
],
[
"t",
"security"
],
[
"t",
"infosec"
]
],
"content": "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\nhttps://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/",
"sig": "ff9c7fae456d16b6e010ae5f1329307d028f2377770bd64661d783566c70301b4d30c5171b51e42d44cdf1d20c190f84f74d8e162a87c7c3647220f21e630b79"
}
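The "id" field above is not arbitrary: under NIP-01 (which defines the event format for kind-1 text notes like this one), it is the SHA-256 digest of a canonical JSON serialization of the array `[0, pubkey, created_at, kind, tags, content]` with no extra whitespace. A minimal Python sketch of that computation, with the fields copied from the JSON above (recomputing the digest should reproduce the stated id only if the copy is byte-exact, so the check below is on the digest's format, not its value):

```python
import hashlib
import json

# Fields copied from the kind-1 (text note) event shown above.
event = {
    "pubkey": "0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
    "created_at": 1726322077,
    "kind": 1,
    "tags": [["t", "cybersecurity"], ["t", "security"], ["t", "infosec"]],
    "content": (
        "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\n"
        "https://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/"
    ),
}

# NIP-01 canonical serialization: the UTF-8 JSON encoding of
# [0, pubkey, created_at, kind, tags, content], with no whitespace
# between tokens and non-ASCII characters left unescaped.
serialized = json.dumps(
    [0, event["pubkey"], event["created_at"], event["kind"],
     event["tags"], event["content"]],
    separators=(",", ":"),
    ensure_ascii=False,
)

event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(event_id)  # 64 lowercase hex characters
```

The "sig" field is then a Schnorr signature over this 32-byte id, verifiable against the "pubkey", which is what lets any relay or client check the note's integrity without trusting the relay that served it.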