2024-09-15 16:03:35
in reply to

iefan 🕊️ on Nostr:

Honestly speaking, we don't really have any good solutions right now. It is super centralized, with virtually no privacy.

There is hope that Chromium browsers might start supporting local LLMs.

Chrome AI docs (via a dev.to writeup): https://dev.to/grahamthedev/windowai-running-ai-locally-from-devtools-202j

Meanwhile, I have built a web app using WebLLM and WebGPU that lets you run any LLM locally in your browser without performance compromises (use a PC for larger models).

https://nostr-local-ai.vercel.app/
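For context, the core of a WebLLM app like this is small. A minimal sketch, assuming the `@mlc-ai/web-llm` package and a WebGPU-capable browser; the model id and prompt are illustrative examples, not taken from the app above:

```typescript
// Browser-only sketch: WebLLM downloads model weights into the browser and
// runs inference on the GPU via WebGPU — nothing leaves the machine.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function main() {
  // Feature-detect WebGPU before downloading any weights.
  if (!("gpu" in navigator)) {
    throw new Error("WebGPU is not available in this browser");
  }

  // Download and compile the model in-browser; larger models need more VRAM,
  // which is why a PC is recommended. The model id here is a hypothetical example.
  const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC", {
    initProgressCallback: (report) => console.log(report.text),
  });

  // WebLLM exposes an OpenAI-style chat completions API.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Summarize Nostr in one sentence." }],
  });
  console.log(reply.choices[0].message.content);
}

main();
```

Because inference runs entirely client-side, this sidesteps both the centralization and the privacy problems mentioned above.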

Right now, I am just waiting for the right technology.
Author Public Key
npub1cmmswlckn82se7f2jeftl6ll4szlc6zzh8hrjyyfm9vm3t2afr7svqlr6f