Wired Security RSS on Nostr: A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions
Published at 2024-09-14 14:14:50

Event JSON
{
  "id": "1ceee4b3eebc66f821821cd6caff9b119e811118681f077f8b47b9c781d895ba",
  "pubkey": "0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
  "created_at": 1726323290,
  "kind": 1,
  "tags": [
    ["t", "cybersecurity"],
    ["t", "security"],
    ["t", "infosec"]
  ],
  "content": "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\nhttps://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/",
  "sig": "5198c0693408aa2521ee2d7a171ed208a1e7148b660aabb6cfefa289c14657151df395cf212235b80802a37119d79a90d6c2ece63690ddb36057fdec4b2f7b7f"
}
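The `id` field above is not arbitrary: under Nostr's NIP-01 specification, it is the SHA-256 hash of the canonical serialization `[0, pubkey, created_at, kind, tags, content]`, JSON-encoded with no extra whitespace and with non-ASCII characters left unescaped. A minimal Python sketch of that check, using the fields from this event (the function name `compute_event_id` is illustrative, not part of any library):

```python
import hashlib
import json

def compute_event_id(event: dict) -> str:
    """Recompute a Nostr event id per NIP-01's canonical serialization."""
    serialized = json.dumps(
        [
            0,  # NIP-01 fixes the first array element to the literal 0
            event["pubkey"],
            event["created_at"],
            event["kind"],
            event["tags"],
            event["content"],
        ],
        separators=(",", ":"),  # no whitespace between tokens
        ensure_ascii=False,     # keep UTF-8 characters unescaped
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

event = {
    "pubkey": "0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
    "created_at": 1726323290,
    "kind": 1,
    "tags": [["t", "cybersecurity"], ["t", "security"], ["t", "infosec"]],
    "content": "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\nhttps://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/",
}
print(compute_event_id(event))
```

Verifying the `sig` field additionally requires a Schnorr signature check over this id against the `pubkey`, which needs a secp256k1 library and is omitted here.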