Tovarich EmmyNoether on Nostr
This does not surprise me. Generative language models are shonky.
Watch the Royal Society Christmas Lectures if they're still on iPlayer. I got to see the lecturer give a "grown up" version of them as a talk.
AI is not *intelligent.* It's better thought of as a hugely souped-up version of the predictive text on your phone. Based on the prior probabilities of "what will follow word A (which in turn follows B which follows C... etc)" it makes a weighted guess (based on its training data) as to what the next word will be.
There is no cognition, no checking for internal consistency, no fact checking, no sources referenced - because it's not doing any of the things a human does when they summarise a text. It's just giving probabilistically weighted guesses as to what the next word in a sentence will be, given the preceding words. It's sophisticated enough that the depth it goes back into "past words" will throw up some relatively high weightings if the string of past words includes technical terms or proper names, so it's likely to get, say, a request for a biography of Henry VIII roughly right. But a biography of a relatively unknown person who isn't widely discussed may be wildly off the mark.
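The "souped-up predictive text" idea can be sketched in a few lines. The following is a hypothetical toy illustration, not how a real LLM is built: it uses a bigram model (each guess conditioned on only one preceding word, where real models condition on thousands) trained on a made-up corpus, and makes a weighted guess at the next word from observed frequencies.

```python
import random
from collections import Counter, defaultdict

# Tiny made-up corpus; a real model trains on vast amounts of text.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each preceding word (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Make a weighted guess at the next word from observed frequencies."""
    counts = following[word]
    words = list(counts)
    weights = [counts[w] for w in words]
    return random.choices(words, weights=weights)[0]

# After "the", the corpus contains cat (x2), mat (x1), fish (x1),
# so "cat" is the likeliest guess - but mat or fish can come out too.
print(predict_next("the"))
```

Nothing here checks facts or consistency; the model only replays frequency statistics of its training text, which is the point being made above.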
Published at 2024-09-04 12:32:10

Event JSON
{
  "id": "2334eec4085d50ae2152f4356eda08bc4d316323354bd4105d81e4242a16fb82",
  "pubkey": "5ee6f525d4167be88b8a6ae6e1470765346268d0d01e8fb3812465962459da83",
  "created_at": 1725453130,
  "kind": 1,
  "tags": [
    ["p", "feed01a21ecd1d440a914a0113848f6ba244c9a4e0d5363d88f0e01c67501d7e"],
    ["e", "e7997b2962bafdb633a2cce083220195daa37a58c7ef3e193bbb19918d72fe61", "", "root", "feed01a21ecd1d440a914a0113848f6ba244c9a4e0d5363d88f0e01c67501d7e"],
    ["proxy", "https://spinster.xyz/objects/0a0e3ded-7a92-467e-9456-29e94346cf5d", "activitypub"],
    ["L", "pink.momostr"],
    ["l", "pink.momostr.activitypub:https://spinster.xyz/objects/0a0e3ded-7a92-467e-9456-29e94346cf5d", "pink.momostr"],
    ["-"]
  ],
  "content": "This does not surprise me. Generative language models are shonky.\n\nWatch the Royal Society Christmas Lectures if it's still on i-player. I got to see the guy who gave it give a \"grown up\" version as a talk.\n\nAI is not *intelligent.* It's better thought of as a hugely souped-up version of the predictive text on your phone. Based on the prior probabilities of \"what will follow word A (which in turn follows B which follows C... etc)\" it makes a weighted guess (based on its training data) as to what the next word will be.\n\nThere is no cognition, no checking for internal consistency, no fact checking, no sources referenced - because it's not doing any of the things a human does when they summarise a text. It's just giving probabilistically weighted guesses as to what the next word in a sentence will be, given the preceding words. It's sophisticated enough that the depth it goes back into \"past words\" will throw up some relatively high weightings if the string of past words includes technical terms or proper names, so it's likely to get, say, a request for a biography of Henry VIII roughly right. But a biography of a relatively unknown person who isn't widely discussed may be wildly off the mark.",
  "sig": "34aaa6c101e9161c38a76e83f1b4d1b29b893550f0037be37cdf1568a78f92e252c11c23b22c9d1166a66f2e2ab695ff461eaf554279823a75435f3e477d65c2"
}