T-posing 3D Rotating Puppy on Nostr
People who use LLMs for advice about technical fields: do you prefer one single conversation with a lot of context, or starting a new conversation from scratch every time?
I don't _rely_ on them, but when I try them out I'm definitely in camp #2
(If you don't know the internals: LLMs have limited _context windows_. When a conversation gets close to the limit, chatbots will generally replace the older history with a summary of it, so past a certain point the model no longer sees your actual chat history; on the other hand, its entire context stays on topic)
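For the curious, here is a minimal sketch of that summarization strategy, just to illustrate the idea. The token budget, the rough token count, and the summarize() callback are hypothetical stand-ins, not how any particular chatbot actually implements it:

# Sketch of the "summarize when near the context limit" strategy described above.
# All numbers and helpers here are illustrative assumptions, not a real chatbot's internals.

MAX_CONTEXT_TOKENS = 8192
SUMMARY_TRIGGER = int(MAX_CONTEXT_TOKENS * 0.8)  # start compacting near the limit


def count_tokens(messages):
    # Crude approximation: about 4 characters per token.
    return sum(len(m["content"]) for m in messages) // 4


def compact_history(messages, summarize):
    """If the conversation is close to the context limit, replace the older
    messages with a single summary message and keep only the recent tail."""
    if count_tokens(messages) < SUMMARY_TRIGGER:
        return messages  # plenty of room: keep the full history

    keep_recent = messages[-4:]      # the most recent turns stay verbatim
    older = messages[:-4]            # everything earlier gets compressed
    summary_text = summarize(older)  # e.g. another model call that condenses the old turns
    summary_msg = {
        "role": "system",
        "content": "Summary of earlier conversation: " + summary_text,
    }
    return [summary_msg] + keep_recent

The trade-off this sketch makes visible is the one the question is about: the summary keeps the whole context on topic, but details are lost in compression, which is why starting a fresh conversation gives you exact control over what the model actually sees.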
Published at 2024-08-15 05:55:04
Event JSON
{
  "id": "f79c8b5a4f93863a01c73e7331cee5aaeb75f684a9b6673519726412b500e525",
  "pubkey": "91127e2523af2b8487190747a1f54c527329b67cbac3606283352b0ee005c42f",
  "created_at": 1723701304,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://fedi.aria.dog/objects/cc7e33aa-2ad8-4bb8-86d6-69724f5ba686",
      "activitypub"
    ]
  ],
  "content": "People who use llm for advice about technical fields, do you prefer one single conversation with a lot of context, or to remake a conversation from scratch every time? \n\nI don't _rely_ on them but when i try them out I'm definitely in camp #2\n\n(If you don't know the internals: LLMs have limited _context windows_. When close to the limit, chatbots will generally replace the conversation history with a summary of it, so past a certain point, the computer no longer has access to your chat history, but on the other hand, its entire context will be on topic)",
  "sig": "eb21beb3273729d8b7f3f3c8be693485b4001ff682db1ab95730fed1f3b4603a44c0984b98a74fab9307cf1c0f1fcdf726d9fa92d6f6dd55b817c365ac78372f"
}