OrangeCrush on Nostr
I would play around with Perplexity; their search model is pretty good. It's gonna be so good on the Rabbit K1 if the conversational AI is as good as it is in the Perplexity app.
For my nostr bots, I'm thinking of self-hosting a processing LLM so I can feed the Perplexity results into a local streaming chat session and get more continuity from post to post. The plan is to request only limited information from the Perplexity sonar model (to keep token usage low) and then have the local processing model craft the sonar response into a post (picking hashtags, applying mood, style, bias, analysis, etc.). I'm so hyped about it that I'm thinking of getting a rig with a big NVIDIA GPU. The 2GB of VRAM on my GTX 1000 series card isn't cutting it. 😭
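
Roughly what that two-stage pipeline could look like, as a minimal sketch: Perplexity is reached through its OpenAI-compatible chat completions API, and the local processing model through an OpenAI-compatible server (Ollama here). The model names, prompts, endpoint URLs, and the example topic are placeholder assumptions, not anything specified in the post.

# Minimal sketch of the two-stage bot pipeline described above.
# Assumptions (placeholders, not from the post): Perplexity's OpenAI-compatible
# API at https://api.perplexity.ai, a local model served by Ollama's
# OpenAI-compatible endpoint, and the model names used below.
from openai import OpenAI

sonar = OpenAI(api_key="PPLX_API_KEY", base_url="https://api.perplexity.ai")
local = OpenAI(api_key="unused", base_url="http://localhost:11434/v1")

# Running chat history kept on the local side so each post builds on earlier ones.
history = [{
    "role": "system",
    "content": "You write short nostr posts. Pick hashtags and keep a "
               "consistent mood, style, and analytical bias across posts.",
}]

def fetch_facts(topic: str) -> str:
    """Ask the sonar model for a terse summary to keep token usage low."""
    r = sonar.chat.completions.create(
        model="sonar-small-online",  # placeholder model name
        messages=[{"role": "user",
                   "content": f"In under 100 words, what's the latest on {topic}?"}],
        max_tokens=200,
    )
    return r.choices[0].message.content

def craft_post(topic: str) -> str:
    """Feed the sonar summary into the local streaming chat session and build a post."""
    history.append({
        "role": "user",
        "content": f"Write a nostr post about {topic} from these notes:\n{fetch_facts(topic)}",
    })
    chunks = []
    for part in local.chat.completions.create(
            model="llama3",  # placeholder local model
            messages=history,
            stream=True):
        chunks.append(part.choices[0].delta.content or "")
    post = "".join(chunks)
    history.append({"role": "assistant", "content": post})  # continuity post to post
    return post

if __name__ == "__main__":
    print(craft_post("bitcoin mining difficulty"))

The split keeps each paid sonar call small, while the local model carries the persona and the long-running context between posts.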
Published at 2024-04-07 05:09:00
Event JSON
{
"id": "ddb2a8466e0ae42a6c8754a919f7dfba11e94c65703cd56dae85c44b8aa7ada7",
"pubkey": "6c0e9016b9c1efe7edef87e505e2a1e03db1fbbcfdc3ed6b4f8aaa4b540ad98f",
"created_at": 1712466540,
"kind": 1,
"tags": [
[
"e",
"320894f735f6933b2e6f157ad2a5597a3c37ddb63cbd24d18b2b100ca068286a",
"wss://relay.orange-crush.com/",
"root"
],
[
"e",
"50facbda9f0c17e46259a88530449f629f63502431701f9c55a9b623dac382bb",
"wss://nostr.bitcoiner.social/",
"reply"
],
[
"p",
"3aa5817273c3b2f94f491840e0472f049d0f10009e23de63006166bca9b36ea3",
"",
"mention"
]
],
"content": "I would play around with Perplexity, their search model is pretty good. It's gonna be so good on the Rabbit K1 if the conversational AI is as good as it is on the perplexity app.\n\nFor my nostr bots, I am thinking to self host a processing LLM locally so I can take the perplexity results into a local streaming chat session to create more continuity from post to post and just request limited information from the perplexity sonar model (to keep token usage low) and then have the local processing model craft the sonar response into a post (picking hashtags, applying mood, style, bias, analysis, etc.). I am so hyped about it I am thinking to get a rig with a big NVIDIA GPU. My 2GB of VRAM on GTX 1000 series isn't cutting it. 😭 ",
"sig": "df1a89e2b47a57951a5694448adf4cff0ba908e19d860a68976675fcc11373892cdd2ee588a167b6ea1dfc057682f4af2517323b2f946e91c85abc1bfde82193"
}
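
For context on the fields above: per NIP-01, the id is the sha256 of the compact JSON serialization of [0, pubkey, created_at, kind, tags, content], and the sig is a Schnorr signature over that id. Below is a quick sketch of the id check, assuming Python's compact json.dumps matches NIP-01's escaping rules for this content (plain text with newlines and an emoji) and that the fields are copied byte-for-byte from the event.

# Recomputing the event id per NIP-01: sha256 over the compact JSON array
# [0, pubkey, created_at, kind, tags, content]. Values are copied from the
# event above; any byte-level copy difference will change the hash.
import hashlib
import json

pubkey = "6c0e9016b9c1efe7edef87e505e2a1e03db1fbbcfdc3ed6b4f8aaa4b540ad98f"
created_at = 1712466540
kind = 1
tags = [
    ["e", "320894f735f6933b2e6f157ad2a5597a3c37ddb63cbd24d18b2b100ca068286a",
     "wss://relay.orange-crush.com/", "root"],
    ["e", "50facbda9f0c17e46259a88530449f629f63502431701f9c55a9b623dac382bb",
     "wss://nostr.bitcoiner.social/", "reply"],
    ["p", "3aa5817273c3b2f94f491840e0472f049d0f10009e23de63006166bca9b36ea3",
     "", "mention"],
]
content = (
    "I would play around with Perplexity, their search model is pretty good. "
    "It's gonna be so good on the Rabbit K1 if the conversational AI is as "
    "good as it is on the perplexity app.\n\n"
    "For my nostr bots, I am thinking to self host a processing LLM locally "
    "so I can take the perplexity results into a local streaming chat session "
    "to create more continuity from post to post and just request limited "
    "information from the perplexity sonar model (to keep token usage low) "
    "and then have the local processing model craft the sonar response into "
    "a post (picking hashtags, applying mood, style, bias, analysis, etc.). "
    "I am so hyped about it I am thinking to get a rig with a big NVIDIA GPU. "
    "My 2GB of VRAM on GTX 1000 series isn't cutting it. \U0001F62D "
)

serialized = json.dumps(
    [0, pubkey, created_at, kind, tags, content],
    separators=(",", ":"),
    ensure_ascii=False,
)
# Should print ddb2a8466e0ae42a6c8754a919f7dfba11e94c65703cd56dae85c44b8aa7ada7
# if the serialization rules and copied bytes line up.
print(hashlib.sha256(serialized.encode("utf-8")).hexdigest())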