How much video RAM is needed to run a version of these models that's actually smart, though? I tried the DeepSeek model that fits within 8 GB of video RAM, and it was basically unusable.
quoting nevent1q…jqgd
kpr797 on Nostr:
How good of a video card do you need to run actually smart LLMs locally? #asknostr #ai #llm
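Not a definitive answer, but a rough way to think about it: VRAM needs are driven mostly by parameter count times bytes per parameter (set by the quantization), plus some headroom for the KV cache and runtime overhead. A minimal back-of-the-envelope sketch, assuming illustrative bytes-per-parameter figures and ~20% overhead rather than measured numbers:

```python
# Rough VRAM estimate for running an LLM locally.
# Assumptions (not from this thread): weights dominate memory use,
# quantization sets bytes per parameter, and ~20% extra covers the
# KV cache, activations, and runtime overhead.

BYTES_PER_PARAM = {
    "fp16": 2.0,   # half-precision weights
    "q8":   1.0,   # 8-bit quantization
    "q4":   0.55,  # ~4.5 bits/param, typical of 4-bit quant formats
}

def estimate_vram_gb(params_billion: float, quant: str = "q4",
                     overhead: float = 0.20) -> float:
    """Approximate GB of VRAM to hold the model plus runtime overhead."""
    # 1B params at 1 byte/param is roughly 1 GB of weights.
    weights_gb = params_billion * BYTES_PER_PARAM[quant]
    return weights_gb * (1.0 + overhead)

if __name__ == "__main__":
    for name, size in [("8B", 8), ("32B", 32), ("70B", 70)]:
        for quant in ("q4", "q8", "fp16"):
            print(f"{name:>4} model @ {quant:4}: ~{estimate_vram_gb(size, quant):.0f} GB VRAM")
```

By that math, an 8B model at 4-bit fits in roughly 5-6 GB (which is the class of model that fits on an 8 GB card), while a 70B model at 4-bit wants on the order of 40+ GB, i.e. a 24 GB card is not enough on its own and people resort to multiple GPUs or CPU offloading.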