Darth Hideout on Nostr
When I was wee, computers were the size of barns & yet there were still vending machines. Before bill scanners & card readers, these mechanical contrivances could achieve something humans require intelligence, training & practice for: They could sell merchandise & even make change. They mainly distinguished direct physical characteristics of coins such as weight, diameter, thickness, presence of milling, etc. But this was enough for a machine to do a job that required a human to think.
At no time did any such device ever think.
The same goes for every #computer program ever written. Computer programs, like vending machines, are state machines, cleverly contrived (at most) to respond to different real-time conditions. We often lazily say that a computer makes a decision, e.g. based on your input, it "decides" what to do. This is false. The human who wrote the program made the decision. The program is responding to its input as it was designed, by a human, to do. It's never the machine doing the thinking or deciding. That human process of thought & design may be faulty & error-prone; the operation of the machine may surprise the mechanic. This doesn't mean the machine is thinking. It means the mechanic is not thinking enough.
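To make the state-machine analogy concrete, here is a minimal sketch of a vending machine as a program; the 75-cent price, the accepted coin set, and the `vend` function are invented for illustration, not taken from any real machine. Every branch the program "takes" was fixed in advance by whoever wrote the rules.

```python
# A minimal sketch of a vending machine as a state machine.
# Every "decision" below was made by the author of these rules,
# not by the running program.

PRICE = 75                     # cents; hypothetical price for illustration
ACCEPTED_COINS = {5, 10, 25}   # the designer decided which coins count

def vend(coins_inserted):
    """Return (dispensed, change_or_refund) for a sequence of inserted coins."""
    credit = 0
    for coin in coins_inserted:
        if coin not in ACCEPTED_COINS:
            continue           # the machine "rejects" the coin: a rule, not a judgment
        credit += coin         # accumulate state
    if credit >= PRICE:
        return True, credit - PRICE   # "makes change" by plain subtraction
    return False, credit              # not enough credit: refund everything

print(vend([25, 25, 25]))  # (True, 0)
print(vend([25, 10, 5]))   # (False, 40)
```

The program never weighs options; it only walks the table of conditions its author laid down.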
And of course, the same goes for modern #AI. The machines are far more elaborate and contrived to use (not "understand") natural language; to obtain (not "discover") source material on the internet; to acquire state (not "learn") to extend the capabilities offered. Lazy terminology is common & inspires fantasies.
Honestly, I watch my Roomba & I use the same words to myself: It "can't figure out" how to get around a certain corner, or it "decided" to take another pass down the hallway. It's hard to avoid, but it doesn't thereby become true.
Published at 2024-11-26 04:49:31