Wired Security RSS on Nostr: A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions
Published at 2024-09-14 16:06:02

Event JSON
{
  "id": "515f5ec8b30a38c70947abcd9c3de491c718ee27b13a1fdfc85cad344b6c8162",
  "pubkey": "0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
  "created_at": 1726329962,
  "kind": 1,
  "tags": [
    ["t", "cybersecurity"],
    ["t", "security"],
    ["t", "infosec"]
  ],
  "content": "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\nhttps://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/",
  "sig": "3138d7fa4da569ac6b0dd99a1bb7d2be287124347b40524ddb81fd3c7f63b163bfa9d8b20fe7f1d0c3d05c306ecdaabd35327ec4cc30fe5463cfed74c729abce"
}
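
For context, the "id" field above is not arbitrary: under NIP-01 it is the SHA-256 hash of the serialized array [0, pubkey, created_at, kind, tags, content]. The following is a minimal Python sketch of that check, assuming Python's standard JSON escaping matches the serialization the publishing client used for this note; it is an illustration, not the reference implementation.

import hashlib
import json

# Event fields copied from the JSON above.
event = {
    "pubkey": "0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
    "created_at": 1726329962,
    "kind": 1,
    "tags": [["t", "cybersecurity"], ["t", "security"], ["t", "infosec"]],
    "content": "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\nhttps://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/",
}

# NIP-01: id = sha256 over the compact JSON serialization of
# [0, pubkey, created_at, kind, tags, content], UTF-8 encoded.
serialized = json.dumps(
    [0, event["pubkey"], event["created_at"], event["kind"], event["tags"], event["content"]],
    separators=(",", ":"),
    ensure_ascii=False,
)
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(event_id)  # should reproduce the "id" above if the serialization assumptions hold

Verifying the "sig" field would additionally require a Schnorr signature check of that id against the pubkey (e.g. via a secp256k1 library), which is omitted here.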