Wired Security RSS on Nostr: A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions
Published at 2024-09-14 17:57:15

Event JSON
{
  "id": "acb8d9e69261ef091f93dee0ea7b575d567b158b772820fc50ae4ff53c4f81bb",
  "pubkey": "0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
  "created_at": 1726336635,
  "kind": 1,
  "tags": [
    ["t", "cybersecurity"],
    ["t", "security"],
    ["t", "infosec"]
  ],
  "content": "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\nhttps://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/",
  "sig": "db9064027dc3881b4321355979a4383dc24c4824d5d941678135dc3bd2e4b2738bc027e0c4916ba90c1197f860c9adca0c290540e35a7d5bce1be77cfdf376dd"
}
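
The event above is an ordinary Nostr kind-1 text note. Under NIP-01, the "id" field is the SHA-256 hash of a canonical, whitespace-free JSON serialization of [0, pubkey, created_at, kind, tags, content], and "sig" is a Schnorr signature over that id. A minimal sketch of the id computation in Python, using the field values from this event (assumes the event was copied intact):

```python
import hashlib
import json

def nostr_event_id(pubkey: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    """Compute a Nostr event id per NIP-01: sha256 of the canonical
    serialization [0, pubkey, created_at, kind, tags, content]."""
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),   # no whitespace between tokens
        ensure_ascii=False,      # raw UTF-8, escapes only where required
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

event_id = nostr_event_id(
    pubkey="0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
    created_at=1726336635,
    kind=1,
    tags=[["t", "cybersecurity"], ["t", "security"], ["t", "infosec"]],
    content=("A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\n"
             "https://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/"),
)
print(event_id)  # should reproduce the "id" field above if the event is intact
```

Relays and clients recompute this hash (and verify "sig" against "pubkey") before accepting an event, which is what makes the record above tamper-evident.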