2025-03-20 18:46:46

$3000 and it can run 200B models? Not bad compared to $30000 for an H100. If all you care about is running the latest, biggest LLMs locally, then it's the cheapest option by far. Otherwise you're looking at a Mac Studio.
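Back-of-the-envelope check on whether a 200B model fits: weight memory is roughly parameter count times bytes per parameter, so quantization is what makes it feasible. A rough sketch (ignores KV cache, activations, and runtime overhead, so real usage is higher):

```python
def weight_mem_gb(params: float, bits_per_param: int) -> float:
    """Approximate memory needed just for model weights, in GB."""
    return params * bits_per_param / 8 / 1e9

# A 200B-parameter model at common precisions:
for bits in (16, 8, 4):
    print(f"{bits}-bit: {weight_mem_gb(200e9, bits):.0f} GB")
# 16-bit: 400 GB
# 8-bit:  200 GB
# 4-bit:  100 GB
```

So at 4-bit quantization you need on the order of 100 GB of unified memory for weights alone, which is the regime a high-memory desktop box or a Mac Studio plays in, and far beyond a single 80 GB H100.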

If it runs on plain old Linux (with the gay Nvidia drivers, of course), then that's even better than Digits, which uses DGX OS.

If only there was a truly neutral, uncensored LLM released to the wild. The possibilities would be endless.
Author Public Key
npub1clamz5n660wpqgsudxf5hs08unqcm4trvmwlsahtpshpxa63x72q2safsd