2023-08-27 07:03:00
in reply to Raul007

I’m running llama2 and Code Llama locally on my laptop. Lots of fun. I think only the 7b models. Wonder if I could run 13b; I have 24 GB RAM.

Really want to be able to feed it docs, PDFs, etc. Currently only running it in the command line via Ollama.
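For what it's worth, a rough back-of-envelope check suggests a 13b model should fit comfortably in 24 GB. This sketch assumes roughly 4-bit quantization (about 0.5 bytes per parameter, in line with the quantized GGUF builds Ollama ships by default) and a hypothetical couple of gigabytes of overhead for the KV cache and runtime; the exact numbers vary by quantization level and context length.

```python
# Back-of-envelope memory estimate for a quantized 13B-parameter model.
# Assumptions (not exact figures): ~4-bit quantization, i.e. ~0.5 bytes
# per parameter, plus a rough 2 GB allowance for KV cache and runtime.
params = 13e9
bytes_per_param = 0.5            # ~4-bit quantization
overhead_gb = 2.0                # rough allowance, varies with context size

model_gb = params * bytes_per_param / 1e9
total_gb = model_gb + overhead_gb
print(f"model ~{model_gb:.1f} GB, total ~{total_gb:.1f} GB")
```

By this estimate the model weights alone are around 6.5 GB, so a 13b quantized model leaves plenty of headroom on a 24 GB machine; an unquantized fp16 13b model (~26 GB of weights) would not fit.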
Author Public Key
npub1y0kt3nttqhre2utsglce4pzyma67lp3xumldwzkkfrdkpjjht6qqnlyrh7