Tom Morris on Nostr:
This morning, I asked a popular LLM a question about something that requires a little bit of expertise.
It precisely located the source of the information necessary to answer it... then provided paragraphs of wholly incorrect conclusions based on the correct source.
Here's a bold idea: you could have a system that gives you the source of the answer without the regurgitated incorrectness. Then rely on the human to read the original text and engage their brain.
Maybe call it a search engine.
Published at 2024-08-28 07:55:47

Event JSON
{
  "id": "3f33c936eae159fd1061b88f87dc2da641a96c9a096bcc1db87f2cd97eff58e5",
  "pubkey": "e004c205d6f2fbe6dfeda0bd1c8b20ec28d4c3814c1c79387abf6b7047638900",
  "created_at": 1724831747,
  "kind": 1,
  "tags": [
    ["proxy", "https://mastodon.social/@tommorris/113038573395285334", "web"],
    ["proxy", "https://mastodon.social/users/tommorris/statuses/113038573395285334", "activitypub"],
    ["L", "pink.momostr"],
    ["l", "pink.momostr.activitypub:https://mastodon.social/users/tommorris/statuses/113038573395285334", "pink.momostr"],
    ["-"]
  ],
  "content": "This morning, I asked a popular LLM a question about something that requires a little bit of expertise.\n\nIt precisely located the source of the information necessary to answer it... then provided paragraphs of wholly incorrect conclusions based on the correct source.\n\nHere's a bold idea: you could have a system that gives you the source of the answer without the regurgitated incorrectness. Then rely on the human to read the original text and engage their brain.\n\nMaybe call it a search engine.",
  "sig": "540123d10d44d49af2cf5d78d236211818db2ab40f41850986bc139ced93ed359ee1435e7345df4d2f6b3c4858e796e731948ea314901dad3bcc21111889e41a"
}