mark tyler on Nostr: Re: AI does not truly understand concepts such as addition, subtraction, or text ...
Re: AI does not truly understand concepts such as addition, subtraction, or text summarization. Rather, it produces a mathematical estimate to satisfy our desired results.
If it can form internal representations of these concepts, and it definitely does in some cases, then it will predict text better. So I disagree with this sentence. It is motivated to internalize concepts, and it does; bigger models will do so even more. See the video I linked for more on that, especially the unicorn drawing section.
Published at 2023-04-19 01:02:56

Event JSON
{
  "id": "e8bfec3dfb58b7672d869ae31eb54abf1e9ec17d9a6f7723b5a5e4da7e1bfa6a",
  "pubkey": "9baed03137d214b3e833059a93eb71cf4e5c6b3225ff7cd1057595f606088434",
  "created_at": 1681866176,
  "kind": 1,
  "tags": [
    ["e", "494bc5c6d8516f1bb712b0dcbe5a43a11724ff0f4ce44ba45fa87bd33ed8192d"],
    ["e", "f149ead009251b3ac548426a699bf4f1ca6b34b6fd17da0436056707bec35e98"],
    ["p", "1bc70a0148b3f316da33fe3c89f23e3e71ac4ff998027ec712b905cd24f6a411"],
    ["p", "07eced8b63b883cedbd8520bdb3303bf9c2b37c2c7921ca5c59f64e0f79ad2a6"]
  ],
  "content": "Re: AI does not truly understand concepts such as addition, subtraction, or text summarization. Rather, it produces a mathematical estimate to satisfy our desired results.\n\nIf it can form internal representations of these concepts, and it definitely does in some cases, then it will predict text better. So I disagree with this sentence. It is motivated to internalize concepts, and it does. Bigger models will more so. See that video I linked for more on that, especially the unicorn drawing section ",
  "sig": "a87a01f5999f75ff4689c72cf89b11aec64d5f60bc7ac00974a7ced8a40f5670c0e4f98e4aad4c142656fa3c35bcd2b5061171d3140366c2eb5e7744e23475a3"
}
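For context on the fields above: this is a kind-1 (text note) event in the Nostr NIP-01 format, where `id` is defined as the SHA-256 hash of a canonical JSON serialization of the event fields, and `sig` is a Schnorr signature over that id by `pubkey`. A minimal sketch of the id computation, using placeholder tags and content rather than the full signed event above (exact byte-level escaping matters, so this illustrates the scheme rather than reproducing this event's id):

```python
import hashlib
import json

def nostr_event_id(event: dict) -> str:
    """Compute a Nostr event id per NIP-01: the SHA-256 of the UTF-8
    canonical serialization [0, pubkey, created_at, kind, tags, content],
    with no whitespace in the JSON."""
    serialized = json.dumps(
        [0, event["pubkey"], event["created_at"], event["kind"],
         event["tags"], event["content"]],
        separators=(",", ":"),  # compact form, as NIP-01 requires
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Placeholder event: pubkey/created_at/kind taken from the note above,
# but tags and content are simplified stand-ins for illustration.
demo_id = nostr_event_id({
    "pubkey": "9baed03137d214b3e833059a93eb71cf4e5c6b3225ff7cd1057595f606088434",
    "created_at": 1681866176,
    "kind": 1,
    "tags": [],
    "content": "hello",
})
```

A client or relay recomputes this hash from the received fields and rejects the event if it does not match the claimed `id`, which is why the JSON above cannot be edited without invalidating both `id` and `sig`.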