2025-03-25 18:17:31
Daniel Wigton on Nostr (in reply to an earlier note):

Which model? So far llama3.3 is my only tolerable local model. But it is throttled by the speed at which my RAM can feed the remaining 18GB to my CPU. So mostly I talk to my CPU, I guess, even though the GPU is doing 4/7 of the work.
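The bottleneck described here can be sketched with a back-of-envelope estimate: when part of a model is offloaded to system RAM, every generated token has to stream all of those offloaded weights through the CPU once, so memory bandwidth puts a hard ceiling on throughput. The bandwidth figure below is a hypothetical assumption (typical dual-channel DDR5), not a number from the post.

```python
# Rough upper bound on token throughput for the CPU-resident part of a
# partially offloaded model: each token pass reads all offloaded weights once.

def cpu_bound_tokens_per_sec(ram_bandwidth_gbps: float, offloaded_gb: float) -> float:
    """Best-case tokens/sec limited by streaming the offloaded weights from RAM."""
    return ram_bandwidth_gbps / offloaded_gb

# Hypothetical: ~50 GB/s dual-channel DDR5 feeding the 18 GB mentioned in the post.
print(round(cpu_bound_tokens_per_sec(50.0, 18.0), 2))  # prints 2.78
```

Under that assumption the CPU side tops out at under 3 tokens/sec no matter how fast the GPU handles its 4/7 share, which matches the "throttled by my RAM" experience.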
Author Public Key: npub1w4jkwspqn9svwnlrw0nfg0u2yx4cj6yfmp53ya4xp7r24k7gly4qaq30zp