Gerry McGovern on Nostr:
"In order to hallucinate, one must have some awareness or regard for the truth; LLMs, by contrast, work with probabilities, not binary correct/incorrect judgments. Based on a huge map of words created by processing huge amounts of text, LLMs decide which words would most likely follow from the words used in a prompt. They’re inherently more concerned with sounding truthy than delivering a factually correct response, the researchers conclude."
https://www.fastcompany.com/91145865/why-experts-are-using-the-word-bullshit-to-describe-ais-flaws
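The "huge map of words" the quoted passage describes is next-token prediction: the model assigns a score to every token in its vocabulary, converts those scores into probabilities, and emits a likely continuation. Nothing in that loop checks whether the output is true. A minimal sketch of the mechanism, using an invented three-token vocabulary and made-up scores (not real model output):

import math

# Toy next-token scores (logits) for a prompt like "The capital of France is".
# Vocabulary and numbers are invented for illustration; a real LLM scores
# every token in a vocabulary of tens of thousands.
logits = {"Paris": 4.1, "London": 2.3, "banana": -1.0}

# Softmax: turn raw scores into a probability distribution.
total = sum(math.exp(v) for v in logits.values())
probs = {tok: math.exp(v) / total for tok, v in logits.items()}

# Greedy decoding: emit the most probable token. No step here consults
# facts; "Paris" wins only because it is statistically likely, and a
# plausible-sounding wrong token would win in exactly the same way.
best = max(probs, key=probs.get)
print(best, {t: round(p, 3) for t, p in probs.items()})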
Published at
2024-06-26 05:56:11

Event JSON
{
  "id": "4639b2a39f171750cdab7b71cc5d4632216f3f5e1df9a4686d2a318794dd2cf3",
  "pubkey": "5e59183b892b0402c381d2d767231dc95be4b9f4c75eb9c74963a9a83646f123",
  "created_at": 1719381371,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://mastodon.green/users/gerrymcgovern/statuses/112681377557764953",
      "activitypub"
    ]
  ],
  "content": "\"In order to hallucinate, one must have some awareness or regard for the truth; LLMs, by contrast, work with probabilities, not binary correct/incorrect judgments. Based on a huge map of words created by processing huge amounts of text, LLMs decide which words would most likely follow from the words used in a prompt. They’re inherently more concerned with sounding truthy than delivering a factually correct response, the researchers conclude.\"\n\nhttps://www.fastcompany.com/91145865/why-experts-are-using-the-word-bullshit-to-describe-ais-flaws",
  "sig": "d93e18cdb3694f27075aa55f865ef2959c3da11f9dd8c7e0e0fe1e19abc1fb442b69ac8e6b4a98f1df2b79cbb05884426f7e9f364abd7cf4b79bac7b30cb5d13"
}
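One detail worth noting about the event above: per Nostr's NIP-01 specification, the id is not assigned by a server but is the SHA-256 hash of a canonical serialization of the event fields, so any client can recompute and verify it. A sketch of that check; Python's json.dumps approximates NIP-01's string-escaping rules, which agree for typical content like this post:

import hashlib
import json

def nostr_event_id(ev):
    # NIP-01 canonical form: the JSON array
    # [0, pubkey, created_at, kind, tags, content]
    # serialized compactly (no whitespace) and encoded as UTF-8.
    payload = json.dumps(
        [0, ev["pubkey"], ev["created_at"], ev["kind"], ev["tags"], ev["content"]],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

# Recomputing the hash for the event above should reproduce its "id";
# the "sig" field is a Schnorr signature over that id made with "pubkey".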