2024-09-20 14:03:31

there are 3 ways for an LLM to remember.

1. context. your chat stays in the context window of the LLM. this is around 128k tokens for many open source models, which works out to roughly 500k characters (about 100k words of English). see the first sketch after this list.

2. RAG. when the context window is full, the chat has to be stored somewhere else. this is longer-term storage of your words. at this stage nothing has been learned by the LLM yet; it is more like written down in a notebook. the LLM searches the notebook each time you ask a question. see the second sketch below.

3. actual training. this is like reading the contents of the notebook and storing it in the brain, in the connections between neurons. each synapse in the brain is like one cell of a weight matrix in the computer. in this phase some things are ingrained as knowledge, but some things from the notebook are forgotten. it is not complete memorization of the notebook; it is more like reading a notebook and remembering the important things from it. if you read it more times you remember more. kind of like how humans learn. see the third sketch below.
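
to make the first idea concrete, here is a minimal sketch of a rolling context window. the 4-characters-per-token rule of thumb, the function names, and the 128k limit are my own assumptions for illustration, not any particular model's API.

```python
MAX_TOKENS = 128_000

def estimate_tokens(text: str) -> int:
    # rough rule of thumb: ~4 characters per token for English text
    return max(1, len(text) // 4)

def trim_to_window(messages: list[str], budget: int = MAX_TOKENS) -> list[str]:
    # keep the newest messages; drop the oldest once the budget is exceeded
    kept, used = [], 0
    for msg in reversed(messages):  # newest first
        cost = estimate_tokens(msg)
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # back to chronological order

chat = ["hello", "what is nostr?", "nostr is a protocol for..."]
print(trim_to_window(chat, budget=8))  # only the newest message fits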
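
for the notebook idea, here is a toy RAG sketch: store past messages, then search them when a new question comes in. a real system would use an embedding model and a vector store; the bag-of-words cosine scoring below is just a self-contained stand-in.

```python
from collections import Counter
import math

notebook: list[str] = []  # long-term storage for chat that fell out of context

def remember(text: str) -> None:
    notebook.append(text)

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(question: str, k: int = 3) -> list[str]:
    # find the k notebook entries most similar to the question; these would
    # then be pasted into the LLM's prompt so it can answer from them
    q = Counter(question.lower().split())
    ranked = sorted(notebook,
                    key=lambda t: cosine(q, Counter(t.lower().split())),
                    reverse=True)
    return ranked[:k]

remember("my favorite relay is relay.damus.io")
remember("i prefer dark mode in clients")
print(search("which relay do i like?", k=1))
```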
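
and for training, a toy gradient-descent example where each cell of the weight vector plays the role of a synapse. the data and learning rate are made up; the point is that the weights end up holding a lossy summary of the examples, not a verbatim copy, which matches the notebook analogy.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))   # 100 "notebook" examples, 8 features each
y = X @ rng.normal(size=8)      # targets the model should learn

W = np.zeros(8)                 # the "synapses": one cell per connection
lr = 0.01

for epoch in range(50):         # each pass over the notebook reinforces memory
    grad = X.T @ (X @ W - y) / len(X)  # how wrong each weight is, on average
    W -= lr * grad                     # nudge every synapse a little

# W now encodes a compressed impression of the data, not the examples themselves
print(np.round(W, 2))
```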

Author Public Key
npub1nlk894teh248w2heuu0x8z6jjg2hyxkwdc8cxgrjtm9lnamlskcsghjm9c