2025-04-30 10:38:50

kpr797 on Nostr:

How good of a video card do you need to run actually smart LLMs locally? #asknostr #ai #llm
How much video RAM is needed to run a version of these models that is actually smart, though? I tried the DeepSeek model that fits within 8 GB of video RAM, and it was basically unusable.
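As a rough rule of thumb (an assumption on my part, not something from the original post): the weights of a model need roughly parameter-count times bytes-per-parameter of VRAM, plus some headroom for the KV cache and activations. Here is a minimal back-of-envelope sketch; the 20% overhead factor and the model sizes are illustrative assumptions.

```python
def vram_gib(params_billions: float, bits: int = 4, overhead: float = 1.2) -> float:
    """Rough VRAM estimate in GiB for a quantized LLM.

    params_billions: parameter count in billions (e.g. 8 for an 8B model)
    bits: quantization level (4-bit is common for local inference)
    overhead: fudge factor (~20%, assumed) for KV cache and activations
    """
    weight_bytes = params_billions * 1e9 * bits / 8
    return weight_bytes / 2**30 * overhead

# Illustrative figures (assumed model sizes, not benchmarks):
# an 8B model at 4-bit fits in 8 GB of VRAM,
# a 70B model at 4-bit does not fit on a single consumer card.
print(f"8B @ 4-bit:  ~{vram_gib(8):.1f} GiB")
print(f"70B @ 4-bit: ~{vram_gib(70):.1f} GiB")
```

By this estimate, the "smart" larger models (70B-class and up) need tens of gigabytes of VRAM even heavily quantized, which is why the small models that fit in 8 GB feel so much weaker.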
Author public key: npub1asqr6hh9zqgpnudm33vxa9j54kafz0huzmczq5wrjavxjnmshmxs5rt7pd