Wired Security RSS on Nostr: A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions
Published at 2024-09-14 15:55:55

Event JSON
{
  "id": "584b6a30003f622af6ba691da17360478be2e648633224ab53af475dc58e8760",
  "pubkey": "0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
  "created_at": 1726329355,
  "kind": 1,
  "tags": [
    ["t", "cybersecurity"],
    ["t", "security"],
    ["t", "infosec"]
  ],
  "content": "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\nhttps://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/",
  "sig": "24024e98ff93f9af93ae6de8ca671a943ec50420d267cf7bc0743a619649ebff9a6621938d90551d7f07f3157c6aa87e3b071a7a673eb8af60cd4f68db67b3bf"
}
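The `id` field of a Nostr event is not arbitrary: under NIP-01 it is the SHA-256 of a canonical, whitespace-free JSON serialization of the array `[0, pubkey, created_at, kind, tags, content]`. A minimal sketch of recomputing the `id` from the fields above (a full verifier would also check that `sig` is a valid Schnorr signature over this `id` by `pubkey`; Python's `json.dumps` with compact separators is assumed here to match NIP-01's escaping rules for this content):

```python
import hashlib
import json

# Event fields copied from the JSON above.
pubkey = "0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160"
created_at = 1726329355
kind = 1
tags = [["t", "cybersecurity"], ["t", "security"], ["t", "infosec"]]
content = (
    "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\n"
    "https://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/"
)

# NIP-01 canonical form: a JSON array, UTF-8, no whitespace between tokens.
serialized = json.dumps(
    [0, pubkey, created_at, kind, tags, content],
    separators=(",", ":"),
    ensure_ascii=False,
)

# The event id is the hex-encoded SHA-256 of that serialization;
# for a well-formed event it should equal the "id" field shown above.
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(event_id)
```

Relays and clients recompute this hash on receipt, so an event whose `id` does not match its serialized fields is rejected regardless of the signature.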