GeePawHill on Nostr
One of the key facts here keeps getting sidestepped by a mixture of scam marketing and common language usage out there.
LLMs don't *sometimes* make shit up, they *always* make shit up.
That's what an LLM *is*: a piece of software that makes up plausible-sounding shit.
What's impressive about this is the extent of improvement in the plausibility.
What's horrifying about it is the extent to which so many people don't care to distinguish between plausibility and correctness.
Published at 2024-04-02 19:05:04