S!ayer on Nostr:
What's the best LLM to run on your local machine without the need for a heavy GPU?
Llama et al. all seem to require a chunky GPU, but surely we're at the stage (3 years later) where we have some capable local LLMs?
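For context, one common approach to this is running a small quantized model entirely on the CPU via llama.cpp. A minimal sketch, assuming llama-cpp-python is installed and a quantized GGUF file has been downloaded (the model path and prompt below are placeholders, not a specific recommendation):

# CPU-only inference with llama-cpp-python; no GPU required.
# Assumes: pip install llama-cpp-python, plus a locally downloaded
# quantized GGUF model (the path below is a placeholder).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/example-3b-instruct.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,      # context window size
    n_threads=8,     # number of CPU threads to use
    n_gpu_layers=0,  # 0 = keep every layer on the CPU
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Why does quantization help CPU inference?"}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])

The 4-bit quantization (Q4_K_M) is what keeps memory use low enough for a laptop; a 3B-parameter model in that format typically fits in a few GB of RAM.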
Published at 2024-09-15 23:58:48
Event JSON
{
  "id": "3ace45519383b4c02a690755dce5a353c9ba8c46bdf602645304e0a44cfe7c66",
  "pubkey": "cdecc31c6d9406e9a7d6b0067412aa661d9d31c8035c3fd65c06301d1cac3b92",
  "created_at": 1726444728,
  "kind": 1,
  "tags": [],
  "content": "what's the best LLM to run from your local machine without the need for a heavy GPU? \n\nLlama et al. all seem to require a chucky GPU, but surely we're at the stage (3 years later) that we have some local LLMs? ",
  "sig": "b147b309d8ce26119a923aeeb793a0f818a00b6a5d5c0fddf2b04112f312b7fade8364d901098f4a7744a2329e4c186f61afd1de2c9940b1f966db6e1330f54a"
}