Wired Security RSS on Nostr: A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions
Published at 2024-09-14 13:44:30

Event JSON
{
  "id": "3d626b8d818d6d0dbb3606cc93aa48ff7ddaded08f17aff44ac4dc7bc317590d",
  "pubkey": "0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
  "created_at": 1726321470,
  "kind": 1,
  "tags": [
    ["t", "cybersecurity"],
    ["t", "security"],
    ["t", "infosec"]
  ],
  "content": "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\nhttps://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/",
  "sig": "9467c4e46fa4f283502c7c59e3475d66b31990343f98cde038ff26725e67aabd758cd46796164be76e54a7ab6eeebd4ad7d7992b9061d4e060ba222ecd3eedb2"
}
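
The `id` field above is not arbitrary: under NIP-01, it is the SHA-256 hash of the event serialized as the compact JSON array `[0, pubkey, created_at, kind, tags, content]`. A minimal sketch of that derivation, using the fields from this event (assuming Python's default JSON string escaping matches NIP-01's serialization rules for this content):

```python
import hashlib
import json

def event_id(pubkey: str, created_at: int, kind: int, tags, content: str) -> str:
    # NIP-01: serialize [0, pubkey, created_at, kind, tags, content]
    # as UTF-8 JSON with no extra whitespace, then SHA-256 it.
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Fields taken from the event JSON above.
eid = event_id(
    "0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
    1726321470,
    1,
    [["t", "cybersecurity"], ["t", "security"], ["t", "infosec"]],
    "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\n"
    "https://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/",
)
print(eid)  # 64-char hex digest; should match the event's "id" field
```

The `sig` is then a Schnorr signature over this 32-byte id by the key in `pubkey`, which is how relays and clients verify the note without trusting the server that delivered it.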