2024-01-19 18:16:36
⚗️alchemist⚗️ on Nostr, in reply to nemesis :blackhole:

The question on my mind is how do we know that "all of math" won't turn out to be a narrow problem in the same sense? Naively, it seems like the main question is whether the higher-level abstractions which guide mathematical thought can be implicitly encoded in some general-purpose neural architecture or whether the ML system needs to be able to encode them explicitly. Maybe every neural architecture has a finite capacity to store higher-level abstractions.

(To be explicit, the higher-level abstractions in geometry would be concepts like the orthocenter or the nine-point circle, which let you reason about a problem in a compressed form. From their paper, it sounds like some of these abstractions were hard-coded (like "construct an excircle"), but I can believe that the LLM was also learning its own abstractions in some unintelligible form.)
Author public key: npub1pt6l3a97fvywrxdlr7j0q8j2klwntng35c40cuhj2xmsxmz696uqfr6mf6