2024-02-24 15:51:26
in reply to

Vajas on Nostr:

It's kind of a bit of both. Finding a large model isn't too hard; Gemini 1.5 Pro, for example, can ingest whole movie scripts or huge codebases and find the right answer in them.

I implemented a RAG LLM app that runs locally on my machine with a local model; it scans my source code directories and answers questions about them, all in less than 100 lines of Python.
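Roughly how that fits in so few lines, as a sketch rather than the poster's actual code: index the files in a local vector store, retrieve the closest matches for a question, and hand them to a local model. The chromadb/ollama stack, the "llama3" model name, and the paths below are assumptions for illustration.

```python
# Minimal local RAG sketch over a source tree (illustrative, not the poster's code).
# Assumes `pip install chromadb ollama` and a running Ollama server with a
# local model (here "llama3") already pulled.
from pathlib import Path

import chromadb
import ollama

# Index: store each source file in a persistent local collection.
# Chroma embeds the documents with its default local embedding model.
client = chromadb.PersistentClient(path="./code_index")
collection = client.get_or_create_collection("source_code")

for path in Path("./my_project").rglob("*.py"):  # hypothetical project path
    text = path.read_text(errors="ignore")
    if text.strip():
        collection.add(documents=[text], ids=[str(path)])

# Retrieve: pull the files most similar to the question.
question = "Where is the retry logic for HTTP requests implemented?"
hits = collection.query(query_texts=[question], n_results=3)
context = "\n\n".join(hits["documents"][0])

# Generate: give the retrieved code plus the question to the local model.
answer = ollama.chat(
    model="llama3",
    messages=[{
        "role": "user",
        "content": f"Using this code:\n{context}\n\nAnswer: {question}",
    }],
)
print(answer["message"]["content"])
```

In practice you'd chunk large files and re-index only changed ones, but even this whole-file version shows why the approach stays small.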
Author Public Key
npub1966wu9kumxx4tsnc9fded5hwl673adljkj5xhd37refsa4lpmrkq88hkfn