Fi, infosec-aspected on Nostr: So, A, kind of an interesting precedent that a chatbot can incur liability for a ...
Published at 2024-02-15 02:17:14

Event JSON
{
"id": "fdafb56bc109902d7ee94ed8f711b4cd85d9a17995b904084d5637cefeac3112",
"pubkey": "25fc0b9f1e98512cc1685b15113098bd5c75558d9ae42c3fa66352f9c141405e",
"created_at": 1707963434,
"kind": 1,
"tags": [
[
"proxy",
"https://infosec.exchange/users/munin/statuses/111933091631290402",
"activitypub"
]
],
"content": "https://bc.ctvnews.ca/air-canada-s-chatbot-gave-a-b-c-man-the-wrong-information-now-the-airline-has-to-pay-for-the-mistake-1.6769454\n\nSo, A, kind of an interesting precedent that a chatbot can incur liability for a company - but consider what this suggests for the use of LLMs that are well-known at this point to 'hallucinate' inaccurate information.",
"sig": "648bf97d32a698d0851929101028f266c1b21e139e3fd10c6e03b88868008f36dd68dcb6c44849a42861d2ec45c5dc54f9519c4749e0ce24751e7fa026f2407e"
}
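The `id` field in the event above is not arbitrary: under the Nostr protocol (NIP-01), it is the SHA-256 hash of a canonical JSON serialization of the array `[0, pubkey, created_at, kind, tags, content]` with no whitespace. A minimal sketch of that check in Python, using the event data shown above (the long hex strings are abbreviated in comments for readability but must be used in full):

```python
import hashlib
import json

# The Nostr event from above (NIP-01 kind-1 text note, proxied from ActivityPub).
event = {
    "id": "fdafb56bc109902d7ee94ed8f711b4cd85d9a17995b904084d5637cefeac3112",
    "pubkey": "25fc0b9f1e98512cc1685b15113098bd5c75558d9ae42c3fa66352f9c141405e",
    "created_at": 1707963434,
    "kind": 1,
    "tags": [
        [
            "proxy",
            "https://infosec.exchange/users/munin/statuses/111933091631290402",
            "activitypub",
        ]
    ],
    "content": "https://bc.ctvnews.ca/air-canada-s-chatbot-gave-a-b-c-man-the-wrong-information-now-the-airline-has-to-pay-for-the-mistake-1.6769454\n\nSo, A, kind of an interesting precedent that a chatbot can incur liability for a company - but consider what this suggests for the use of LLMs that are well-known at this point to 'hallucinate' inaccurate information.",
    "sig": "648bf97d32a698d0851929101028f266c1b21e139e3fd10c6e03b88868008f36dd68dcb6c44849a42861d2ec45c5dc54f9519c4749e0ce24751e7fa026f2407e",
}

def nip01_event_id(ev: dict) -> str:
    """Recompute the NIP-01 event id: SHA-256 over the UTF-8 JSON
    serialization of [0, pubkey, created_at, kind, tags, content],
    with no whitespace between tokens."""
    serialized = json.dumps(
        [0, ev["pubkey"], ev["created_at"], ev["kind"], ev["tags"], ev["content"]],
        separators=(",", ":"),   # no spaces, per the NIP-01 canonical form
        ensure_ascii=False,      # non-ASCII stays raw UTF-8, per NIP-01
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

print(nip01_event_id(event) == event["id"])
```

This only verifies the id/hash consistency; verifying `sig` additionally requires a Schnorr signature check over the id against `pubkey` (e.g. with a secp256k1 library), which is omitted here.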