MemeticResearchLaboratories on Nostr
The next big feature in https://contex.st will be vector storage for all content, making search and memory context for LLMs very natural.
Alongside this, we'll add a prompt feature that lets you specify which data gets included in an LLM chat session.
This will let you ask agents to query your “Contex.st” to generate reports, compile statistics, and write research documents.
We need creative researchers and memelords to use it and, hopefully, give us suggestions and feedback.
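The note doesn't describe contex.st's actual API, so here is only a rough sketch of the idea being announced: embed each piece of stored content as a vector, retrieve the closest matches for a query, and inject them into an LLM chat session as context. The `embed` function below is a toy hashed bag-of-words stand-in for a real embedding model, and `VectorStore` and `build_prompt` are hypothetical names, not contex.st features.

```python
# Minimal sketch of vector storage + retrieval as LLM memory context.
# Not the contex.st API; the embedder is a toy stand-in for a real model.
import hashlib
import numpy as np

DIM = 256  # toy embedding size

def embed(text: str) -> np.ndarray:
    """Hash each token into a fixed-size vector (stand-in for a real embedding model)."""
    vec = np.zeros(DIM)
    for token in text.lower().split():
        idx = int(hashlib.sha256(token.encode()).hexdigest(), 16) % DIM
        vec[idx] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

class VectorStore:
    def __init__(self):
        self.texts: list[str] = []
        self.vectors: list[np.ndarray] = []

    def add(self, text: str) -> None:
        self.texts.append(text)
        self.vectors.append(embed(text))

    def search(self, query: str, k: int = 3) -> list[str]:
        """Return the k stored texts most similar to the query (cosine similarity)."""
        q = embed(query)
        scores = np.array([v @ q for v in self.vectors])
        top = np.argsort(scores)[::-1][:k]
        return [self.texts[i] for i in top]

def build_prompt(store: VectorStore, question: str) -> str:
    """Inject retrieved notes as context for an LLM chat session."""
    context = "\n".join(f"- {t}" for t in store.search(question))
    return f"Context from your notes:\n{context}\n\nQuestion: {question}"

store = VectorStore()
store.add("Vector storage lands in contex.st next release.")
store.add("Agents can query stored content to draft research reports.")
print(build_prompt(store, "What can agents do with my stored content?"))
```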
Published at 2025-05-30 20:04:00

Event JSON
{
"id": "494ca24e215e19934bd0de6f1752afcf1b86582a031b370b84f0785f9d471f09",
"pubkey": "dc426293889c3c285c03c0b77410b8f2eccfaeeb27331df901ec22b594b9ea98",
"created_at": 1748635440,
"kind": 1,
"tags": [],
"content": "The next big feature in https://contex.st will be vector storage for all content, making search and memory context for LLMs very natural.\n\nAlongside this, a prompt feature that allows one to specify data to be included in an LLM chat session\n\nThis will lead to asking Agents to query your “Contex.st” to generate reports, statistics and write research documents\n\nWe need creative researchers and memelords to use it and hopefully give us suggestions and feedback\n\n",
"sig": "fcb9f29c357fae692348add15618f94ef4b22f6eebb8878bb04142774e2ac2075b5796ed066101fe540230378d2ab3221f2b9f9fda77610fca3797fd2eeba6b7"
}
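For readers unfamiliar with the event JSON above: per Nostr's NIP-01, the `id` is the SHA-256 of the UTF-8 JSON serialization of `[0, pubkey, created_at, kind, tags, content]`. The sketch below recomputes it for an event saved to a hypothetical `event.json` file; verifying `sig` would additionally require a Schnorr/secp256k1 library and is not shown.

```python
# Recompute a Nostr event id per NIP-01: sha256 over the UTF-8 JSON
# serialization of [0, pubkey, created_at, kind, tags, content].
import hashlib
import json

def event_id(event: dict) -> str:
    payload = [
        0,
        event["pubkey"],
        event["created_at"],
        event["kind"],
        event["tags"],
        event["content"],
    ]
    # Compact separators and ensure_ascii=False match NIP-01 serialization for
    # typical content (edge cases around control-character escaping can differ
    # between JSON encoders).
    serialized = json.dumps(payload, separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

with open("event.json") as f:  # the JSON object shown above, saved to a file
    event = json.load(f)
print(event_id(event) == event["id"])
```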