2024-09-13 14:25:04
Vyram Kraven on Nostr:

I know someone who self-hosts an open LLM using Ollama. A self-hosted AI is always better than using GPT, Claude, or any of the others. With those, your info is tracked and logged, and they are designed to censor or to answer certain questions a specific way.

With an open-source model you can get an unfiltered answer and adjust, through Ollama, how flexible the responses are (precise, or a little loose to stay flexible).
You can organize conversations into folders and see all the chats for anyone if you run it and make it public for anyone to use, and you can run multiple models and adjust them when their responses are off, to teach them corrections.
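The "precise vs. loose" knob described above is, in Ollama's terms, the sampling temperature, which can be set per request through the local HTTP API Ollama serves. A minimal sketch, assuming Ollama is running on its default port 11434 and a model named `llama3` has been pulled (both assumptions):

```python
# Query a locally hosted Ollama server over its HTTP API (default port 11434).
import json
import urllib.request

def build_payload(prompt, model="llama3", temperature=0.2):
    """Request body: low temperature -> precise, higher -> looser, more varied."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete response instead of chunks
        "options": {"temperature": temperature},
    }

def ask(prompt, **kwargs):
    """Send a non-streaming generate request and return the response text."""
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(build_payload(prompt, **kwargs)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Nothing leaves your machine; the same `options` dict is where you would push temperature toward 1.0 when you want looser answers.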

The drawback is that bigger models require a ton of RAM and GPU memory; if your machine can't handle it, it will freeze while trying to generate a response. I really recommend 128 GB of RAM and a very high-quality GPU. If you go public with account creation and plan to run a public service, you might want to look into multi-GPU servers.
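To put rough numbers on "a ton of RAM": a common back-of-the-envelope estimate (an approximation, not a benchmark) is parameters × bits-per-weight ÷ 8 for the weights alone, with KV cache and runtime overhead on top:

```python
# Rough sketch: memory needed just to hold a model's weights.
# Real usage is higher: KV cache and runtime overhead come on top.
def weight_memory_gb(params_billions, bits_per_weight):
    """GB for the weights alone, ignoring cache and overhead."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# e.g. a 70B-parameter model quantized to 4 bits per weight:
print(round(weight_memory_gb(70, 4)))  # → 35
```

So a 4-bit 70B model wants roughly 35 GB of RAM/VRAM before overhead, which is why 128 GB of RAM gives comfortable headroom for the larger models.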

With multiple users on it at the same time, you will hear your GPU get loud. It's still worth it to host your own, because in the end the data is yours. What's needed is more developers, so it can be optimized to run on less power.
Author Public Key
npub1k06ctulq8rcaen067tt6z3y6gxrqtsuje5em5lcn009zfpp0t7gqf0zyu9