Alan Siefert on Nostr:
It’s a problem with transformer-based architectures. They are basically feed-forward networks where the “neurons” all face the same way: input layer -> inner layers -> output layer. Nothing loops back, so there’s no persistent internal state. They need a new design for more complex reasoning and eventual consciousness. Maybe liquid neural networks? Not sure what it’s going to be.
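For illustration, here is a minimal numpy sketch of the contrast the post is drawing: a strictly feed-forward pass where activations only move one way, next to a liquid-style recurrent cell whose hidden state feeds back into itself over time. The layer sizes, random weights, and the Euler-integrated liquid time-constant update are toy choices for this sketch, not taken from any real model.

import numpy as np

rng = np.random.default_rng(0)
d_in, d_hidden, d_out = 4, 8, 2   # toy sizes, chosen arbitrarily

# Feed-forward: input layer -> inner layer -> output layer, one way only.
W1 = rng.normal(size=(d_hidden, d_in))
W2 = rng.normal(size=(d_out, d_hidden))

def feed_forward(x):
    h = np.tanh(W1 @ x)   # activations never revisit an earlier layer
    return W2 @ h

# Liquid-style cell: one Euler step of dh/dt = -h/tau + tanh(W x + U h),
# so the hidden state loops back on itself and keeps evolving over time.
W = rng.normal(size=(d_hidden, d_in))
U = rng.normal(size=(d_hidden, d_hidden))
tau, dt = 2.0, 0.1   # time constant and step size, arbitrary here

def liquid_step(h, x):
    return h + dt * (-h / tau + np.tanh(W @ x + U @ h))

x = rng.normal(size=d_in)
print("feed-forward output:", feed_forward(x))

h = np.zeros(d_hidden)
for _ in range(5):   # same input each step, yet the state keeps changing
    h = liquid_step(h, x)
print("liquid hidden state after 5 steps:", h)

The point of the toy: feed_forward gives the same answer every call, while liquid_step carries state forward, which is the kind of internal dynamics the post suggests transformers lack.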
Published at 2024-09-07 20:02:56

Event JSON
{
  "id": "b5c3cd62abe604770b90d3ea5e2ad6d9cfe0f31582a1ba1ddc389b032d07c5ef",
  "pubkey": "f26606e331f128325ec99cd38782edbec134a5f911ea9fada7adce2ca9015b16",
  "created_at": 1725739376,
  "kind": 1,
  "tags": [
    [
      "e",
      "5130e929be23692fab8cd602e4af49395a09d8efbf57f1060d7aab2c6d0310cc",
      "wss://nostr-pub.wellorder.net",
      "root"
    ],
    [
      "p",
      "c6f7077f1699d50cf92a9652bfebffac05fc6842b9ee391089d959b8ad5d48fd"
    ]
  ],
  "content": "It’s a problem with the transformer-based architecture. They are basically feed-forward networks where the “neurons” all face the same way, input layer -\u003e inner layers -\u003e output layer. They need a new design for more complex reasoning and eventual consciousness. Maybe liquid neural networks? Not sure what it’s going to be.",
  "sig": "49ecc2a8e77897750621b7a598ce32e681329411ed5b7a68ff61912d5d883474dee25e418cbfb716a6541c81548773335d11a6f856d5f62848177ac217824a5c"
}