Vajas on Nostr: It kind of is a bit of both. Finding a large model isn’t too hard. I mean Gemini ...
It kind of is a bit of both. Finding a large model isn’t too hard. I mean Gemini 1.5 Pro eats movie scripts and finds the right answer in there or in huge codebases.
I implemented a RAG LLM app that runs locally on my machine with a local model; it scans my source code directories and gives me answers about them. And it was less than 100 lines of Python.
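The post gives no implementation details, but a local RAG app of that size is plausible. A minimal sketch, using naive bag-of-words retrieval over source files (a real app would use embeddings, and the local-model call, e.g. to an Ollama endpoint, is stubbed out as an assumption):

```python
# Hypothetical sketch of a tiny local RAG app -- not the author's code.
# Retrieval: bag-of-words cosine similarity over files under a directory.
# Generation: retrieved files would be stuffed into a prompt for a local
# model (e.g. POST to Ollama at http://localhost:11434/api/generate).
import math
import os
from collections import Counter

def tokenize(text):
    # Lowercase and split on non-alphanumeric characters.
    return "".join(c.lower() if c.isalnum() else " " for c in text).split()

def cosine(a, b):
    # Cosine similarity between two token-count vectors (Counters).
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def index_sources(root, exts=(".py",)):
    # Walk the source tree and build a token-count vector per file.
    docs = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            if name.endswith(exts):
                path = os.path.join(dirpath, name)
                with open(path, errors="ignore") as f:
                    docs[path] = Counter(tokenize(f.read()))
    return docs

def retrieve(docs, question, k=3):
    # Rank files by similarity to the question; top-k become the context.
    q = Counter(tokenize(question))
    ranked = sorted(docs.items(), key=lambda kv: cosine(q, kv[1]), reverse=True)
    return [path for path, _ in ranked[:k]]
```

A real version would chunk large files and send the top-k chunks plus the question to the local model; the retrieval half above is the only deterministic part, which is why it is all that is sketched here.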
Published at 2024-02-24 15:51:26

Event JSON
{
  "id": "f7cb0770a4df0dca347a4ee55e97a148284da7389ff05eb2afb4f1cd3fa3a02d",
  "pubkey": "2eb4ee16dcd98d55c2782a5b96d2eefebd1eb7f2b4a86bb63e1e530ed7e1d8ec",
  "created_at": 1708789886,
  "kind": 1,
  "tags": [
    ["e", "1ff81010b2639b28cf5721965ca3327b12b611c2d39fdeefffeff580d96f4a16"],
    ["e", "943b49ec11fcf2af0f741dee84bdcc3bfba7877c5c063e45960e2679f110f0e7"],
    ["p", "303d576b1254d9e8875594c4dd09e10429e4ee3b40e53fd5c3f8b710b1dc9b1d"],
    ["p", "303d576b1254d9e8875594c4dd09e10429e4ee3b40e53fd5c3f8b710b1dc9b1d"]
  ],
  "content": "It kind of is a bit of both. Finding a large model isn’t too hard. I mean Gemini 1.5 Pro eats movie scripts and finds the right answer in there or in huge codebases.\n\nI implemented a RAG LLM app that runs locally on my machine with a local model and it scans my source code directories and gives me answers about them. And it was less than 100 lines of Python.",
  "sig": "bc8cd9f224724a84005fbda19079ba09dba5e13dfcc85ea2e22bd5e6d0fba616befbbb11ac14048653abfcec1f8a8288699d3b69f266cb91a597b7ea0c58f9ac"
}