Tom Morris on Nostr:
This morning, I asked a popular LLM a question about something that requires a little bit of expertise.
It precisely located the source of the information necessary to answer it... then provided paragraphs of wholly incorrect conclusions based on the correct source.
Here's a bold idea: you could have a system that gives you the source of the answer without the regurgitated incorrectness. Then rely on the human to read the original text and engage their brain.
Maybe call it a search engine.
Published at 2024-08-28 07:55:47

Event JSON
{
  "id": "e299eb134b00131ca1bb1ffbc9920334dd3c2a21fe32daff42899471f9859166",
  "pubkey": "3fb03273ccfb93b475b4da1adbc77095e599011eb4cd3d3eb32200cd3ea04668",
  "created_at": 1724831747,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://mastodon.social/users/tommorris/statuses/113038573395285334",
      "activitypub"
    ]
  ],
  "content": "This morning, I asked a popular LLM a question about something that requires a little bit of expertise.\n\nIt precisely located the source of the information necessary to answer it... then provided paragraphs of wholly incorrect conclusions based on the correct source.\n\nHere's a bold idea: you could have a system that gives you the source of the answer without the regurgitated incorrectness. Then rely on the human to read the original text and engage their brain.\n\nMaybe call it a search engine.",
  "sig": "c066b4d624fc72b99120dc93b5eb5eff84b275fda2547f0ff8cac31d4b41dbb48d7fe6ab04e499340922289490506fcec832a2352c46acdeaab117fe4113089d"
}
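For anyone checking the event above: per NIP-01, the "id" field is the SHA-256 hash of a canonical serialization of the other fields, and "sig" is a BIP-340 Schnorr signature over that id. A minimal Python sketch of the id derivation, using only the standard library (json.dumps approximates NIP-01's character escaping, which is sufficient for content like this event's):

import hashlib
import json

def nostr_event_id(event: dict) -> str:
    # NIP-01: the id is the SHA-256 of the JSON array
    # [0, pubkey, created_at, kind, tags, content],
    # serialized with no extra whitespace.
    payload = [
        0,
        event["pubkey"],
        event["created_at"],
        event["kind"],
        event["tags"],
        event["content"],
    ]
    # ensure_ascii=False keeps UTF-8 characters unescaped, as NIP-01 requires.
    serialized = json.dumps(payload, separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

Running this over the event above should reproduce its "id"; verifying "sig" additionally requires a BIP-340-capable secp256k1 library.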