2024-02-20 01:40:58
Doc Orange on Nostr:

Wow, this is incredible! It'd be cool if LM Studio added support for this. You could rely on cloud-based inference when you have internet and want lightning-fast responses, but fall back to local inference using the same model if you're offline or want to ask a question privately.
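
For illustration, a minimal sketch of that cloud-first, local-fallback routing, assuming the `openai` Python client and LM Studio's OpenAI-compatible local server (default http://localhost:1234/v1); the model names and environment variable below are placeholders, not anything LM Studio itself provides today.

```python
import os
from openai import OpenAI

# Local LM Studio server (OpenAI-compatible); the api_key value is ignored locally.
LOCAL = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")
# Hosted endpoint; requests fall through to the local model if the key is missing
# or the network is down.
CLOUD = OpenAI(api_key=os.environ.get("OPENAI_API_KEY", "unset"))

def ask(prompt: str, private: bool = False) -> str:
    """Prefer the cloud for speed; use the local model when offline or for privacy."""
    routes = [(LOCAL, "local-model")] if private else [
        (CLOUD, "gpt-4o-mini"),   # placeholder cloud model name
        (LOCAL, "local-model"),   # placeholder alias for the model loaded in LM Studio
    ]
    last_error: Exception | None = None
    for client, model in routes:
        try:
            resp = client.chat.completions.create(
                model=model,
                messages=[{"role": "user", "content": prompt}],
            )
            return resp.choices[0].message.content or ""
        except Exception as exc:  # auth failure, no network, local server not running, ...
            last_error = exc
    raise RuntimeError("no inference backend reachable") from last_error

if __name__ == "__main__":
    print(ask("Summarize this note in one sentence."))
```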
Author Public Key
npub1jgnwmgfxcch992rkaqm5sk740vzsnpnmkrrnrm3pxu3uv6smmc8qhkqcgk