Zorro Notorious MEB on Nostr: Howdy Fediverse: I gotta research question / curious little bug in my ear. Given a ...
Howdy Fediverse: I gotta research question / curious little bug in my ear.
Given a trained large language model, can we measure / estimate the ratio of the number of bits of training data over the number of bits in the LLM's "knowledge base"? If so, what are some numbers for the most popular LLMs?
#AskFedi #LLM #AI
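A crude upper-bound version of the question can be sketched with back-of-envelope arithmetic: compare the bits of raw training text to the bits in the stored weights. The figures below are assumptions for illustration only (roughly the publicly reported ~2T training tokens for Llama-2-70B, ~4 bytes of raw text per token, fp16 weights), not measurements of any model's actual "knowledge base":

```python
# Back-of-envelope estimate of (training-data bits) / (model-weight bits).
# All concrete figures below are illustrative assumptions, not measurements.

def bits_ratio(train_tokens: float, bytes_per_token: float,
               n_params: float, bits_per_param: float) -> float:
    """Return bits of raw training text divided by bits of stored weights."""
    train_bits = train_tokens * bytes_per_token * 8  # bytes -> bits
    model_bits = n_params * bits_per_param
    return train_bits / model_bits

# Assumed: 2e12 tokens, ~4 bytes of raw text per token,
# 70e9 parameters stored as 16-bit floats.
ratio = bits_ratio(2e12, 4, 70e9, 16)
print(f"training bits / model bits ≈ {ratio:.1f}")  # ≈ 57.1
```

Note this only bounds the ratio of raw bytes to parameter bytes; the information actually retained by the model (its Shannon/Kolmogorov "knowledge" content) is far smaller than the weight storage and much harder to measure.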
Published at 2024-06-12 04:04:27

Event JSON
{
"id": "313a12559b973185b3f4813e7a9a64c3b0719934659f133eb91f6d96bb6196e6",
"pubkey": "a43ece8d2ba8675d19792ba3b87a10a497953e7962c4cc9e63cf37606602dcd6",
"created_at": 1718165067,
"kind": 1,
"tags": [
[
"t",
"askfedi"
],
[
"t",
"llm"
],
[
"t",
"ai"
],
[
"proxy",
"https://universeodon.com/users/AlgoCompSynth/statuses/112601665832163615",
"activitypub"
]
],
"content": "Howdy Fediverse: I gotta research question / curious little bug in my ear.\n\nGiven a trained large language model, can we measure / estimate the ratio of the number of bits of training data over the number of bits in the LLM's \"knowledge base\"? If so, what are some numbers for the most popular LLMs?\n\n#AskFedi #LLM #AI",
"sig": "99826b4e465b79243288eea8fda1bdfc2c342aa2adfcdd60efee369863c2c1ce60d1f5d2b44c441301286ea8329b4c2cf37d4fd6bd39f063342a3b1a370c1de9"
}