Wired Security RSS on Nostr: A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions
Published at 2024-09-14 19:18:07
Event JSON
{
"id": "b80bc19b5b4999164fd9e736478a615602517c3563c23febf7dff6821addaa21",
"pubkey": "0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
"created_at": 1726341487,
"kind": 1,
"tags": [
[
"t",
"cybersecurity"
],
[
"t",
"security"
],
[
"t",
"infosec"
]
],
"content": "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\nhttps://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/",
"sig": "6a2bbf5469a31f2b58c29357358e47e6f879b93da2750f3b88204c38dc628c79e9e3f1c5b8bfad85b8c60115bc2456559ffc6763948126af037f2fc31815889f"
}
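The `id` field of a Nostr event is not arbitrary: per the NIP-01 specification it is the SHA-256 hash of a whitespace-free JSON serialization of the array `[0, pubkey, created_at, kind, tags, content]`. A minimal sketch of that derivation in Python, using the fields from the event above (the helper name `nostr_event_id` is illustrative, not part of any library):

```python
import hashlib
import json

def nostr_event_id(pubkey: str, created_at: int, kind: int, tags, content: str) -> str:
    # NIP-01: serialize [0, pubkey, created_at, kind, tags, content]
    # as UTF-8 JSON with no whitespace, then take the SHA-256 hex digest.
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Fields copied from the event record above.
event_id = nostr_event_id(
    pubkey="0a96c51c0ca412c116c04f9bbb3c1338b1b19385e123682def6e7e803fdbf160",
    created_at=1726341487,
    kind=1,
    tags=[["t", "cybersecurity"], ["t", "security"], ["t", "infosec"]],
    content=(
        "A Creative Trick Makes ChatGPT Spit Out Bomb-Making Instructions\n\n"
        "https://www.wired.com/story/chatgpt-jailbreak-homemade-bomb-instructions/"
    ),
)
print(event_id)
```

If the relay data is intact, the printed digest should match the event's `id` field; the `sig` is a Schnorr signature over that same id and requires a secp256k1 library to verify.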