2024-06-26 05:56:11

Gerry McGovern on Nostr:

"In order to hallucinate, one must have some awareness or regard for the truth; LLMs, by contrast, work with probabilities, not binary correct/incorrect judgments. Based on a huge map of words created by processing huge amounts of text, LLMs decide which words would most likely follow from the words used in a prompt. They’re inherently more concerned with sounding truthy than delivering a factually correct response, the researchers conclude."

https://www.fastcompany.com/91145865/why-experts-are-using-the-word-bullshit-to-describe-ais-flaws
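The mechanism the quoted researchers describe, choosing whichever word is most likely to follow, with no separate check for factual correctness, can be illustrated with a toy sketch. Everything below is hypothetical: real LLMs use neural networks over huge vocabularies and full prompt context, not a hand-built table, and the words and probabilities here are invented for illustration.

```python
from typing import Dict

# Hypothetical conditional probabilities: given the last word of the
# prompt, how likely is each candidate next word? (Invented values.)
NEXT_WORD_PROBS: Dict[str, Dict[str, float]] = {
    "the": {"cat": 0.40, "dog": 0.35, "truth": 0.25},
    "cat": {"sat": 0.60, "ran": 0.40},
}

def most_likely_next_word(last_word: str) -> str:
    """Pick the highest-probability continuation. Note there is no
    notion of whether the result is true -- only of how likely it is
    to follow, which is the point the quote is making."""
    candidates = NEXT_WORD_PROBS.get(last_word, {})
    if not candidates:
        return "<unknown>"
    return max(candidates, key=candidates.get)

print(most_likely_next_word("the"))  # -> cat
```

The model "prefers" `cat` after `the` purely because it was assigned the highest probability, not because it checked anything against reality.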
Author public key: npub1tev3swuf9vzq9sup6ttkwgcae9d7fw05ca0tn36fvw56sdjx7y3s3ts6tq