Wired Security RSS on Nostr: A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions
Published at 2024-09-14 14:35:03

Event JSON
{
  "id": "e661cf31ff7712cad261c2b155fd13b436bd0c76bef0aee8c344582bfc7edbd7",
  "pubkey": "0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
  "created_at": 1726324503,
  "kind": 1,
  "tags": [
    ["t", "cybersecurity"],
    ["t", "security"],
    ["t", "infosec"]
  ],
  "content": "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\nhttps://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/",
  "sig": "8fbcd92cc0148fc5b47db3c30afdcd4209360e72a6b44a6235f5ee770842dd7131c7827b41b12a005229f6e32bfae1b8f1aeca21b6fbd670a1b34901c205ef14"
}
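
For context, this is a plain Nostr kind-1 text note, and its "id" field is derived from the other fields. Below is a minimal Python sketch of that derivation, assuming the standard NIP-01 serialization rules (the field values are copied from the event above; the result should reproduce the "id" only if those assumptions hold, so treat it as an illustration rather than a verified check).

import hashlib
import json

# Fields copied from the event above (sketch; assumes NIP-01 serialization).
event = {
    "pubkey": "0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
    "created_at": 1726324503,
    "kind": 1,
    "tags": [["t", "cybersecurity"], ["t", "security"], ["t", "infosec"]],
    "content": "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\nhttps://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/",
}

# NIP-01: id = SHA-256 of the JSON array [0, pubkey, created_at, kind, tags, content],
# serialized with no extra whitespace and with UTF-8 characters left unescaped.
serialized = json.dumps(
    [0, event["pubkey"], event["created_at"], event["kind"], event["tags"], event["content"]],
    separators=(",", ":"),
    ensure_ascii=False,
)
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(event_id)  # expected to match the "id" field above under these assumptions

The "sig" field is a Schnorr signature over that id by the "pubkey" key; verifying it requires a secp256k1 library and is not shown here.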