2024-04-10 00:56:17
OceanSlim on Nostr:

Well, I can help you if you have questions. But running large local LLMs still won't achieve what large language models at data centers can deliver. has more experience building a rig specifically for this, with three 2070s if I remember right. He may have something to say about how well that can realistically perform.
Author Public Key
npub1zmc6qyqdfnllhnzzxr5wpepfpnzcf8q6m3jdveflmgruqvd3qa9sjv7f60