Vedran on Nostr: Well, it's happening! Did you ever try to ask some AI chatbot like Bard, OpenAI some ...
Well, it's happening! Did you ever ask an AI chatbot like Bard or OpenAI's ChatGPT a question, get a confident, well-argued answer, and then check it afterwards only to find it was completely or partly false? That behaviour in AI is called "AI hallucination". I'm glad I now have the terminology to describe something that happens often.
Published at 2023-10-23 08:37:06

Event JSON
{
  "id": "caa8b78bb2851748b96f1f84442aece534759e26bf4c3e8685c29d2c9fa97009",
  "pubkey": "ee0e813924d081ef9c79af794ef5f2113b6c8051de41bad7ad403611cff01b6c",
  "created_at": 1698050226,
  "kind": 1,
  "tags": [],
  "content": "Well, it's happening! Did you ever try to ask some AI chatbot like Bard, OpenAI some question, and get confident and argumented answer you checked afterwards and it turned out it is completly or partly false. That behaviour in AI is called \"AI hallucination\". I'm glad I now have the terminology to describe what happens often.",
  "sig": "27bb3fd4278ceeb2e214c78ae26993ce85a0a6b49a1c5b7158075e93f431819c1284ac220bfc0e187c36f198f89a21875af1e375db6c87d103fd459843a8ea1c"
}