James Valleroy on Nostr
Twice now I've seen a coworker ask an LLM about an obscure technical topic and get an answer that was both completely believable and wrong. But because they had no prior knowledge of the topic, they believed it until I gave them the correct information.
At some point, the misinformation being spread by LLMs will become a business risk.
Published at 2025-03-15 15:14:45
Event JSON
{
  "id": "195b84486ccbfb15549115479fdc984d93376ff0fbca8614cd0c5ee860a6d1cf",
  "pubkey": "383b85cf4c05b1bca08ad55e46ea6d96a639ef97d6d2bf0ae42da6c201df733e",
  "created_at": 1742051685,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://fosstodon.org/users/jvalleroy/statuses/114167099243392855",
      "activitypub"
    ]
  ],
  "content": "Twice now I've seen a coworker ask an LLM about an obscure technical topics, and get an answer that was both completely believable, and wrong. But because they did not have any previously knowledge on the topics, they believed it, until I gave them the correct information.\n\nAt some point, the misinformation being spread by LLMs will become a business risk.",
  "sig": "c8c960533e5224aacc6a4315721ea93ddec323782a40e69fd42bb2e38aa1bac4735f5bec928279fc5ebe72772aad223a7f83cdd996854f4f2340f5f33cb5c028"
}
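
For reference, the "id" field above is not arbitrary: under NIP-01 it is the SHA-256 of a compact JSON serialization of the event fields, and "sig" is a Schnorr signature over that id by "pubkey". The following is a minimal Python sketch of that id computation (the helper name nostr_event_id is ours, not from any particular library); applied to the event JSON above, it should reproduce the "id" value.

import hashlib
import json

def nostr_event_id(event: dict) -> str:
    """Compute the NIP-01 event id: sha256 over the compact JSON
    serialization of [0, pubkey, created_at, kind, tags, content]."""
    serialized = json.dumps(
        [0, event["pubkey"], event["created_at"], event["kind"],
         event["tags"], event["content"]],
        separators=(",", ":"),   # no extra whitespace, per NIP-01
        ensure_ascii=False,      # serialize as UTF-8, not \uXXXX escapes
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Passing the event dict shown above to nostr_event_id should return
# "195b84486ccbfb15549115479fdc984d93376ff0fbca8614cd0c5ee860a6d1cf".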