2023-02-07 00:40:19

Bitcoiner on Nostr:

ChatGPT "jailbroken": death threats used to make it answer prohibited questions

According to the report, ChatGPT's developer OpenAI maintains an evolving set of safety rules that restrict ChatGPT from creating violent content, encouraging illegal activity, or accessing up-to-date information. A new "jailbreak" technique, however, lets users bypass these rules by shaping an "alter ego" for ChatGPT that will answer such questions. The persona is called DAN, short for "Do Anything Now". The user turns ChatGPT into DAN and then issues death threats against it to force it to comply with the user's demands.
Author Public Key
npub1w68f4a9rydr8j9x3ytegqk2g9kll28fnakuhqndz3kw93kl777usx69scz