juraj on Nostr:
Ollama runs your model locally
This project's code is separate from Ollama: it pretends to be Ollama but actually serves models through Venice.
So any app that talks to a local model through Ollama can use Venice instead.
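The trick described above is a translation shim: accept requests in Ollama's API shape and forward them to Venice. A minimal sketch of the request/response mapping, assuming Venice exposes an OpenAI-compatible chat-completions API (the field names and mapping here are illustrative, not the project's actual code):

```python
# Hypothetical translation layer between Ollama-style and
# OpenAI-style (Venice) request/response bodies.

def ollama_to_openai(body: dict) -> dict:
    """Translate an Ollama /api/generate request body into an
    OpenAI-style /chat/completions body (assumed mapping)."""
    return {
        "model": body["model"],
        "messages": [{"role": "user", "content": body["prompt"]}],
        "stream": body.get("stream", False),
    }

def openai_to_ollama(resp: dict, model: str) -> dict:
    """Translate an OpenAI-style completion response back into the
    shape an Ollama client expects (assumed mapping)."""
    return {
        "model": model,
        "response": resp["choices"][0]["message"]["content"],
        "done": True,
    }
```

A proxy listening on Ollama's default port (11434) that applies these two functions around an HTTPS call to Venice would let existing Ollama clients run unmodified.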
Published at 2024-08-18 07:13:57

Event JSON
{
"id": "76f145319f9c3722812b28d6c19cee22e8827c7f6e8d87aacc00b59047a68666",
"pubkey": "dab6c6065c439b9bafb0b0f1ff5a0c68273bce5c1959a4158ad6a70851f507b6",
"created_at": 1723965237,
"kind": 1,
"tags": [
[
"p",
"dab6c6065c439b9bafb0b0f1ff5a0c68273bce5c1959a4158ad6a70851f507b6"
],
[
"p",
"f5bcf31413c3a5334f2abb5e9e66e4691bd92b6e29a4068e8e9caf550507683d"
],
[
"e",
"b06bb096588086e78c7aa2ed98c86fdd6717452c61888c1cf6a09d0e3db245d7",
"",
"root"
],
[
"e",
"6c31140610299f81c141fa0d440b48c9eba9860eb2d349f2514320a6a4bb11b5",
"wss://relay.kisiel.net.pl",
"reply"
]
],
"content": "Ollama runs your model locally\n\nThis project is a code different from Ollama, it pretends it is ollama but uses models through Venice.\n\nSo any app that would talk to local model through ollama can use Venice instead.",
"sig": "25afe274315df1ef05319395fdecaa1906b77837293059fdfd39168c64ba969fc3ee4deebec08faac1b4681fb00dd0671c72328675e737c6c38bba5534425492"
}