2025-03-29 12:20:29
in reply to Rm -rf on Nostr

I came to the same conclusion a while ago. A good way to think about an LLM is as a lossy archive of text data. You enter a text input as a path, and it extracts data based on that path. The smaller the model, the larger the loss of data. Models that are too large will have paths that lead nowhere.
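The lossy-archive analogy can be sketched with a toy example. To be clear, this is not how a real LLM works; the corpus, the truncation-based "compression", and the function names here are all illustrative stand-ins for the idea that a prompt acts as a lookup path into an imperfect store:

```python
# Toy sketch of "LLM as lossy archive": text is stored at reduced fidelity,
# and the prompt acts as a path used to retrieve it.

corpus = {
    "capital of france": "Paris is the capital of France.",
    "speed of light": "Light travels at 299,792,458 m/s in a vacuum.",
}

def compress(text, capacity):
    # Smaller capacity -> more characters dropped -> larger loss,
    # mirroring "the smaller the model, the larger the loss".
    return text[:capacity]

def build_archive(corpus, capacity):
    return {path: compress(text, capacity) for path, text in corpus.items()}

def query(archive, path):
    # A path with nothing behind it "leads nowhere".
    return archive.get(path, "<path leads nowhere>")

small = build_archive(corpus, capacity=10)
large = build_archive(corpus, capacity=100)

print(query(small, "capital of france"))   # truncated, lossy answer
print(query(large, "capital of france"))   # full sentence recovered
print(query(large, "capital of atlantis")) # a path that leads nowhere
```

The point of the sketch: the same query yields a degraded answer from the smaller archive, and a query outside the stored data retrieves nothing useful at all.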
Author Public Key
npub1z80tdpw4rxs7rukrpjkkhuct0akdugp5gr2vauqsv5s3emz05lwq0ary6k