Albert Cardona on Nostr:
Hard to argue the conclusion of this paper isn't right.
#ChatGPT does not hallucinate (non-standard perception experience unrelated to the world), and does not confabulate (fill in a memory gap). Instead, it soft bullshits:
"Bullshit produced without the intention to mislead the hearer regarding the utterer’s agenda."
"ChatGPT is bullshit", Hicks et al. 2024
https://link.springer.com/article/10.1007/s10676-024-09775-5

Published at 2024-06-14 10:34:09

Event JSON
{
  "id": "a84ea25b9ea0a53490f04acc3bd65331420cddf83fd0a4002b28d7ea28a597c1",
  "pubkey": "4a35e50f666883715fb74aa605d42386495301da2a01239c77c92f34213193e0",
  "created_at": 1718361249,
  "kind": 1,
  "tags": [
    [
      "t",
      "chatgpt"
    ],
    [
      "proxy",
      "https://mathstodon.xyz/users/albertcardona/statuses/112614522838415539",
      "activitypub"
    ]
  ],
  "content": "Hard to argue the conclusion of this paper isn't right.\n\n#ChatGPT does not hallucinate (non-standard perception experience unrelated to the world), and does not confabulate (fill in a memory gap). Instead, it soft bullshits:\n\n\"Bullshit produced without the intention to mislead the hearer regarding the utterer’s agenda.\" \n\n\"ChatGPT is bullshit\", Hicks et al. 2024 https://link.springer.com/article/10.1007/s10676-024-09775-5",
  "sig": "8c5ac62b8a887f0936e9a5bfbe10a3385babdb9aa58620b88b4563ce20cc57f1807fc01257e17e8a043ce6b7db0c5d134192b65a7b9743788213b95b08985bf9"
}
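
For context on the Event JSON above: under the Nostr protocol (NIP-01), the "id" field is the SHA-256 hash of a canonical, whitespace-free JSON serialization of the array [0, pubkey, created_at, kind, tags, content], and "sig" is a Schnorr (BIP-340) signature over that id made with the key in "pubkey". A minimal Python sketch of the id derivation, assuming the standard-library JSON encoder with whitespace removed is close enough to the canonical form:

import hashlib
import json

def nostr_event_id(event: dict) -> str:
    # NIP-01: the event id is the SHA-256 of the UTF-8,
    # whitespace-free JSON serialization of
    # [0, pubkey, created_at, kind, tags, content].
    serialized = json.dumps(
        [0, event["pubkey"], event["created_at"], event["kind"],
         event["tags"], event["content"]],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

Applied to the event above, this should reproduce the "id" field; checking "sig" additionally requires a secp256k1/BIP-340 library, which is omitted here.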