Zapgur on Nostr
Can someone explain to me how people reason about the dangers of AGI? Why would a program that guesses the next word of a given text suddenly, or gradually, develop a motivation to do something it's not "supposed to" do? Or feel an emotion, which sounds particularly nuts to me, as if my depressed spaghetti Python script would book an appointment with a psychiatrist.
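For concreteness, "guessing the next word" boils down to a loop like the sketch below. This is a toy bigram counter I made up for illustration, not how an actual LLM works inside (those use huge neural networks trained on vast text), but the outer word-by-word loop has the same shape: predict one word, append it, feed the result back in, repeat.

```python
import random
from collections import Counter, defaultdict

# Tiny made-up training text; real models are trained on vastly more.
corpus = "the cat sat on the mat and the dog sat on the rug".split()

# Count which word follows which one: a table of next-word frequencies.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word(word):
    """Guess a next word, weighted by how often it followed `word`."""
    counts = following.get(word)
    if not counts:
        return None  # dead end: this word was never followed by anything
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# The autoregressive loop: each guess is appended and fed back in.
text = ["the"]
for _ in range(8):
    guess = next_word(text[-1])
    if guess is None:
        break
    text.append(guess)
print(" ".join(text))
```

Nothing in that loop has goals or feelings; it just emits statistically plausible continuations, which is the point of the question above.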
It is confusing that we even call it Intelligence and not a simulation **of** Intelligence.
To give a metaphor: if you yell into a cave, it will generate a new, somewhat different sound, which will make it seem like someone else is in there, but you surely won't think the cave is intelligent, let alone that it will develop the capacity to feel if you feed it enough yelling.
Published at 2023-03-30 08:34:09

Event JSON
{
  "id": "0ea53a594e7088561719ac8a3f15c59cf7c7ddc9d2d90f262e35f06fff4898f9",
  "pubkey": "b3cc54443d605792dd71de3abcb7082328f7d187d85ceef77e047dd7ee22da38",
  "created_at": 1680165249,
  "kind": 1,
  "tags": [],
  "content": "Can someone explain to me how people reason about the dangers of AGI? Why a program that guesses the next word of the given text would suddenly, or gradually, develop a motivation to do something its not \"supposed to\" do? Or feel an emotion, which sounds particularly nuts to me, like my depressed spaghetti python script will make an appointment with a psychiatrist.\n\nIt is confusing we even call it Intelligence and not A simulation **of** Intelligence\n\nTo give a metaphor, if you yell into a cage it will generate new, somewhat different sound, which will look like there's another someone there, but you surely won't think the cage is intelligent, let alone will be able to develop the capacity to feel if you feed it with enough yelling",
  "sig": "3567c3562f168a96e251aa4d993d582ae22fe1b5921500c8ac65312236116c9c9ec8d9a85e10638744912d0faa04488ab72c81c5ea696d4510dc90c780847648"
}