John Carlos Baez on Nostr, in reply to npub10l889…eath2
2024-10-08 22:52:57

- but I didn't say "neural networks research is physics". I said "it all started with physics", and gave evidence for how Hopfield and Hinton were using ideas from statistical mechanics. Yes, neural network research has grown beyond this - it's now its own subject. I don't think anyone is claiming otherwise.

Wikipedia on Hopfield networks:

"The Sherrington–Kirkpatrick model of spin glass, published in 1975, is the Hopfield network with random initialization. Sherrington and Kirkpatrick found that it is highly likely for the energy function of the SK model to have many local minima. In the 1982 paper, Hopfield applied this recently developed theory to study the Hopfield network with binary activation functions. In a 1984 paper he extended this to continuous activation functions. It became a standard model for the study of neural networks through statistical mechanics."

Wikipedia on Boltzmann machines:

"They are named after the Boltzmann distribution in statistical mechanics, which is used in their sampling function. They were heavily popularized and promoted by Geoffrey Hinton, Terry Sejnowski and Yann LeCun in cognitive sciences communities, particularly in machine learning, as part of "energy-based models" (EBM), because Hamiltonians of spin glasses as energy are used as a starting point to define the learning task."
Author Public Key: npub17u6xav5rjq4d48fpcyy6j05rz2xelp7clnl8ptvpnval9tvmectqp8pd6m