2024-09-16 12:09:11

S!ayer on Nostr

This has been informative. Thanks all for the suggestions. Time to deep dive.
What's the best LLM to run on your local machine without needing a heavy GPU?

Llama et al. all seem to require a chunky GPU, but surely we're at the stage (3 years later) where we have some capable local LLMs?
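For anyone deep diving on this, a minimal sketch of CPU-only inference is below, using llama.cpp's Python bindings (llama-cpp-python) with a quantized GGUF model. The model path and parameters are illustrative assumptions, not a specific recommendation.

```python
from llama_cpp import Llama

# Load a quantized GGUF model; n_gpu_layers=0 keeps inference entirely on the CPU.
# The model path below is a hypothetical example -- use whatever GGUF file you have locally.
llm = Llama(
    model_path="models/llama-3-8b-instruct.Q4_K_M.gguf",
    n_ctx=2048,      # context window size
    n_threads=8,     # CPU threads to use
    n_gpu_layers=0,  # no layers offloaded to a GPU
)

# Run a simple completion and print the generated text.
output = llm(
    "Q: What is Nostr? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(output["choices"][0]["text"])
```

With a 4-bit quantized 7B/8B model, this kind of setup typically fits in around 6 GB of RAM and runs at usable speeds on a modern laptop CPU, which is the usual workaround when a heavy GPU isn't available.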
Author Public Key
npub1ehkvx8rdjsrwnf7kkqr8gy42vcwe6vwgqdwrl4juqccp689v8wfqr3mz4s