2024-08-18 07:13:57
juraj on Nostr:

Ollama runs your model locally

This project is a different codebase from Ollama: it pretends to be Ollama but serves models through Venice.

So any app that would talk to a local model through Ollama can use Venice instead.
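A minimal sketch of what that means in practice: an Ollama client sends a normal request to Ollama's default local address and `/api/generate` endpoint, unaware that a proxy is forwarding to Venice behind the scenes. The URL and model name below are assumptions for illustration; the actual send is commented out since it requires a running proxy (or real Ollama) on that port.

```python
import json
import urllib.request

# Assumed: the proxy listens on Ollama's default address, so clients
# need no changes. The model name "llama3" is illustrative.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3",   # passed through to whichever backend answers
    "prompt": "Hello!",
    "stream": False,     # ask for a single JSON response, not a stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# With a proxy (or real Ollama) running locally, the next line would
# return the completion:
# response = urllib.request.urlopen(req)
```

The point is that nothing in the request identifies the backend, which is why swapping Ollama for a Venice-backed impostor is transparent to the client.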
Author Public Key
npub1m2mvvpjugwdehtaskrcl7ksvdqnnhnjur9v6g9v266nss504q7mqvlr8p9