iefan 🕊️ on Nostr: Meta released MobileLLM models in 125M, 350M, 600M, and 1B parameter sizes. Their 1B ...
Published at 2024-10-31 11:54:41
Event JSON
{
"id": "6160e31c7c72d2cee1ac7d59d2f7f6ad7081668cf950007efdecf0471b401b28",
"pubkey": "c6f7077f1699d50cf92a9652bfebffac05fc6842b9ee391089d959b8ad5d48fd",
"created_at": 1730375681,
"kind": 1,
"tags": [
[
"r",
"https://huggingface.co/collections/facebook/mobilellm-6722be18cb86c20ebe113e95"
]
],
"content": "Meta released MobileLLM models in 125M, 350M, 600M, and 1B parameter sizes.\n\ntheir 1B Llama model performs well on phones, even mid-range devices. These one are specifically designed and optimized for mobile hardware.\n\nWorks with llama.c\nhttps://huggingface.co/collections/facebook/mobilellm-6722be18cb86c20ebe113e95",
"sig": "a63f7849c6f36393e7f1f524df571254ac61965d032dd3557524cf6a669c52c8630e52ba8d14d3d2847e63f6d968e2d42364e68d116aabbd3652cd5236d1884f"
}
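
For reference, here is a minimal sketch of how the "id" field of an event like the one above is derived under Nostr's NIP-01, assuming the standard serialization (sha256 over the whitespace-free JSON array [0, pubkey, created_at, kind, tags, content]):

import hashlib
import json

def nostr_event_id(event: dict) -> str:
    # NIP-01: the event id is the sha256 of the UTF-8, whitespace-free
    # JSON serialization of [0, pubkey, created_at, kind, tags, content].
    serialized = json.dumps(
        [0, event["pubkey"], event["created_at"], event["kind"],
         event["tags"], event["content"]],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# For the event exactly as published, the computed hash should equal "id".
# Verifying "sig" additionally requires a secp256k1 Schnorr signature check
# over that id against "pubkey".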
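
And a minimal sketch of loading one of the released checkpoints with Hugging Face transformers; the repo id "facebook/MobileLLM-1B" and the need for trust_remote_code are assumptions based on the linked collection, not stated in the post:

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "facebook/MobileLLM-1B"  # collection also lists 125M, 350M, 600M variants

# MobileLLM uses a custom architecture, so remote code may be required (assumption).
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Small models on phones are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))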