jb55 on Nostr: Llama.cpp runs fine on m2
Published at 2024-09-16 00:00:33

Event JSON
{
  "id": "3d6a83ffafc021fe0e6a095f21f29ebae4642218ff4a7c9fee07889fa22498c9",
  "pubkey": "32e1827635450ebb3c5a7d12c1f8e7b2b514439ac10a67eef3d9fd9c5c68e245",
  "created_at": 1726444833,
  "kind": 1,
  "tags": [
    [
      "e",
      "3ace45519383b4c02a690755dce5a353c9ba8c46bdf602645304e0a44cfe7c66",
      "",
      "root"
    ],
    [
      "p",
      "cdecc31c6d9406e9a7d6b0067412aa661d9d31c8035c3fd65c06301d1cac3b92"
    ]
  ],
  "content": "Llama.cpp runs fine on m2",
  "sig": "41e7f09dc3fa143fed44a1697f65246b849c6877dfceb7dda14a983fad5c85fbbc40684907c701c1ff7fb99fdd8844c0dc86fc014357ff4c26731d2eb61532dc"
}
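For context, the "id" field above is not arbitrary: per Nostr's NIP-01, it is the SHA-256 hash of a canonical JSON serialization of the event's other fields. A minimal sketch in Python, using the field values from the event above (this assumes the standard NIP-01 serialization of `[0, pubkey, created_at, kind, tags, content]` with no extra whitespace):

```python
import hashlib
import json

# Event fields copied from the event JSON above.
pubkey = "32e1827635450ebb3c5a7d12c1f8e7b2b514439ac10a67eef3d9fd9c5c68e245"
created_at = 1726444833
kind = 1
tags = [
    ["e", "3ace45519383b4c02a690755dce5a353c9ba8c46bdf602645304e0a44cfe7c66", "", "root"],
    ["p", "cdecc31c6d9406e9a7d6b0067412aa661d9d31c8035c3fd65c06301d1cac3b92"],
]
content = "Llama.cpp runs fine on m2"

# NIP-01 canonical form: a JSON array [0, pubkey, created_at, kind, tags, content]
# serialized with no whitespace between tokens, UTF-8 encoded.
serialized = json.dumps(
    [0, pubkey, created_at, kind, tags, content],
    separators=(",", ":"),
    ensure_ascii=False,
)

# The event id is the lowercase hex SHA-256 digest of that serialization.
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(event_id)
```

A client or relay recomputes this hash on receipt and rejects the event if it does not match the claimed "id"; the "sig" field is then a Schnorr signature over that same 32-byte hash.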