MachuPikacchu on Nostr:
I started working on one but didn’t get far because… life.
The idea: you run a service that connects to a set of relays and streams all the notes from your web into Kafka. Then you have a consumer group for processing (e.g., sending notes to an LLM for classification, labeling, etc.). That stage publishes to another topic with two consumer groups listening: one that streams to websocket connections and another for storage.
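A minimal sketch of that topology, with Go channels standing in for Kafka topics and goroutines standing in for consumer groups (the `note` type, the `classify` stub, and all names here are illustrative, not from the actual project; a real version would use a Kafka client and an LLM call instead):

```go
package main

import (
	"fmt"
	"strings"
	"sync"
)

// note is a stand-in for a Nostr event pulled off a relay.
type note struct {
	ID      string
	Content string
	Label   string // filled in by the processing stage
}

// classify is a placeholder for the LLM step (the post suggests
// langchain-go + Ollama); here it just tags notes mentioning "kafka".
func classify(n note) note {
	if strings.Contains(strings.ToLower(n.Content), "kafka") {
		n.Label = "infra"
	} else {
		n.Label = "other"
	}
	return n
}

// fanOut models one topic feeding several consumer groups:
// every message is delivered to each downstream channel.
func fanOut(in <-chan note, outs ...chan<- note) {
	for n := range in {
		for _, out := range outs {
			out <- n
		}
	}
	for _, out := range outs {
		close(out)
	}
}

func main() {
	raw := make(chan note)        // "raw notes" topic
	processed := make(chan note)  // "processed" topic
	toWS := make(chan note, 8)    // websocket consumer group
	toStore := make(chan note, 8) // storage consumer group

	// processing consumer group: classify and republish
	go func() {
		for n := range raw {
			processed <- classify(n)
		}
		close(processed)
	}()

	go fanOut(processed, toWS, toStore)

	// producer: notes streamed in from relays
	go func() {
		raw <- note{ID: "1", Content: "streaming notes into Kafka"}
		raw <- note{ID: "2", Content: "gm"}
		close(raw)
	}()

	var wg sync.WaitGroup
	wg.Add(2)
	go func() {
		defer wg.Done()
		for n := range toWS {
			fmt.Println("ws:", n.ID, n.Label)
		}
	}()
	go func() {
		defer wg.Done()
		for n := range toStore {
			fmt.Println("store:", n.ID, n.Label)
		}
	}()
	wg.Wait()
}
```

The point of the fan-out is that the websocket and storage consumers each see every processed note independently, which is exactly what two Kafka consumer groups on one topic give you.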
I used Kafka so eventually you could have commercial relays doing this for many users. It was going to use langchain-go and Ollama by default but obviously can use ChatGPT or Claude if needed.
Another reason for using Kafka is that you can run many concurrent processors. For example, I planned to generate embeddings for each note and store them in a vector database so users could do semantic search too.
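The embedding-search idea can be sketched in a few lines: a plain slice stands in for the vector database, and the vectors would come from an embedding model (e.g. via Ollama) rather than being hard-coded. Everything here (`entry`, `cosine`, `search`) is an illustrative assumption, not code from the project:

```go
package main

import (
	"fmt"
	"math"
	"sort"
)

// entry pairs a note ID with its embedding vector.
type entry struct {
	ID  string
	Vec []float64
}

// cosine returns the cosine similarity of two equal-length vectors.
func cosine(a, b []float64) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += a[i] * b[i]
		na += a[i] * a[i]
		nb += b[i] * b[i]
	}
	if na == 0 || nb == 0 {
		return 0
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

// search ranks stored notes by similarity to the query embedding --
// the nearest-neighbour lookup a vector database would perform.
func search(index []entry, query []float64, k int) []string {
	sort.SliceStable(index, func(i, j int) bool {
		return cosine(index[i].Vec, query) > cosine(index[j].Vec, query)
	})
	ids := []string{}
	for i := 0; i < k && i < len(index); i++ {
		ids = append(ids, index[i].ID)
	}
	return ids
}

func main() {
	index := []entry{
		{"note-a", []float64{0.9, 0.1}},
		{"note-b", []float64{0.1, 0.9}},
	}
	fmt.Println(search(index, []float64{1, 0}, 1)) // nearest neighbour of the query
}
```

A dedicated vector database replaces the linear scan with an approximate index, but the interface — embed, store, rank by similarity — is the same.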
Published at 2024-10-17 15:54:33

Event JSON
{
  "id": "fb78eab03ede30fbf744f5e679f6ff60e1c07152f64aba57c9c338fcebac53a6",
  "pubkey": "1e908fbc1d131c17a87f32069f53f64f45c75f91a2f6d43f8aa6410974da5562",
  "created_at": 1729180473,
  "kind": 1,
  "tags": [
    [
      "e",
      "99c6ea5a55ef34891d38b9f22ba0f521a09a4ce71c7b010b54723b3493d7c347",
      "",
      "root"
    ],
    [
      "p",
      "e2ccf7cf20403f3f2a4a55b328f0de3be38558a7d5f33632fdaaefc726c1c8eb"
    ]
  ],
  "content": "I started working on one but didn’t get far because… life.\n\nThe idea: you run this service that connects to a set of relays and streams all the notes from your web into Kafka. Then you have a consumer group for processing (e.g. sending to an LLM for classification, labeling, etc). Then publish to another topic and have 2 consumer groups listening: one that streams to websocket connections and another for storage.\n\nI used Kafka so eventually you could have commercial relays doing this for many users. It was going to use langchain-go and Ollama by default but obviously can use ChatGPT or Claude if needed.\n\nAnother reason for using Kafka is that you can have so many concurrent processors. For example I planned to generate embeddings for each note and store it in a vector database so users could do search too.",
  "sig": "94ac85e9244b1aead4da1e8151d126826ce1dd1f8ec67990f2ad5540302b37557fab7f949bfd3a5e92be0d0f6b0d7a3ccabba16cdaf6ff9f079c39038d1909b2"
}