Simon Willison on Nostr: If you have 64GB of RAM you may be able to run Llama 3 70B directly on your own ...
Published at 2024-04-22 15:45:30
Event JSON
{
  "id": "fa7ff3675be548dd9037a90e8a655f18a87c04cc83f6dee251f9236df7775cb9",
  "pubkey": "8b0be93ed69c30e9a68159fd384fd8308ce4bbf16c39e840e0803dcb6c08720e",
  "created_at": 1713800730,
  "kind": 1,
  "tags": [
    [
      "e",
      "e5db206d27ac02313834e64786094e2199879dc5c948d32411aeb2664fab366c",
      "wss://relay.mostr.pub",
      "reply"
    ],
    [
      "proxy",
      "https://fedi.simonwillison.net/users/simon/statuses/112315644663399204",
      "activitypub"
    ]
  ],
  "content": "If you have 64GB of RAM you may be able to run Llama 3 70B directly on your own machine - I got it working using llamafile, details in the post here: https://simonwillison.net/2024/Apr/22/llama-3/#local-llama-3-70b-instruct-with-llamafile",
  "sig": "a62d5a80a2a882f0ee7ef02ca92a7846e81ce5016b1485e92d2c7804738bd4a3392b91a394592aadd1022d459b8e14605c35327daacd6ac970fe97af2e7e2d47"
}
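For context on the fields above: under Nostr's NIP-01, "kind": 1 marks a plain text note, the "e" tag with the "reply" marker indicates this note replies to another event, the "proxy" tag records the original ActivityPub status the note was bridged from (via mostr.pub), and "id" is the SHA-256 hash of a canonical serialization of the event, which "sig" then signs with the key in "pubkey". A minimal Python sketch of the id computation, assuming json.dumps with compact separators reproduces the NIP-01 serialization for this content (it contains no characters that need special escaping):

import hashlib
import json

# The event shown above, minus its "sig" field (the signature is made over the id,
# so it does not feed into the hash).
event = {
    "id": "fa7ff3675be548dd9037a90e8a655f18a87c04cc83f6dee251f9236df7775cb9",
    "pubkey": "8b0be93ed69c30e9a68159fd384fd8308ce4bbf16c39e840e0803dcb6c08720e",
    "created_at": 1713800730,
    "kind": 1,
    "tags": [
        [
            "e",
            "e5db206d27ac02313834e64786094e2199879dc5c948d32411aeb2664fab366c",
            "wss://relay.mostr.pub",
            "reply",
        ],
        [
            "proxy",
            "https://fedi.simonwillison.net/users/simon/statuses/112315644663399204",
            "activitypub",
        ],
    ],
    "content": "If you have 64GB of RAM you may be able to run Llama 3 70B directly on your own machine - I got it working using llamafile, details in the post here: https://simonwillison.net/2024/Apr/22/llama-3/#local-llama-3-70b-instruct-with-llamafile",
}

# NIP-01 serialization: the JSON array [0, pubkey, created_at, kind, tags, content]
# with no extra whitespace, UTF-8 encoded, then hashed with SHA-256.
serialized = json.dumps(
    [0, event["pubkey"], event["created_at"], event["kind"], event["tags"], event["content"]],
    separators=(",", ":"),
    ensure_ascii=False,
)

computed_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(computed_id)
print(computed_id == event["id"])  # should print True for a well-formed event

Verifying "sig" would additionally mean checking the BIP-340 Schnorr signature over that hash against "pubkey", which requires a secp256k1 library and is not shown here.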