2023-08-24 04:46:41

lhl on Nostr:

#llm Yesterday, something incredibly cool was released. The first (afaik) open LoRA MoE proof of concept - it lets you run 6 llama2-7B experts on a single 24GB GPU. Here are my setup notes: https://llm-tracker.info/books/howto-guides/page/airoboros-lmoe
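The trick that makes 6 "experts" fit on one 24GB card is that they are LoRA adapters sharing a single base model: only one set of 7B base weights lives in memory, and each request is routed to the best-matching adapter. As a rough, hypothetical sketch of that routing idea (the expert names, descriptions, and bag-of-words similarity below are illustrative assumptions, not the airoboros-lmoe implementation):

```python
# Illustrative sketch only -- NOT the airoboros-lmoe code.
# Idea: one shared base model plus N small LoRA adapters ("experts");
# each prompt is routed to the adapter whose description it most
# resembles, so several "7B experts" fit where one 7B model would.

from collections import Counter
import math

# Hypothetical expert names and routing descriptions (assumptions).
EXPERTS = {
    "code":      "write debug python programming source code function",
    "reasoning": "logic puzzle step by step reasoning math proof",
    "creative":  "story poem fiction creative writing character",
}

def bow(text: str) -> Counter:
    """Crude bag-of-words vector: token -> count."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def route(prompt: str) -> str:
    """Pick the expert whose description best matches the prompt."""
    scores = {name: cosine(bow(prompt), bow(desc))
              for name, desc in EXPERTS.items()}
    return max(scores, key=scores.get)

print(route("debug this python function"))  # -> code
```

In a real setup, the winning adapter's low-rank weights would then be activated on the shared base model for that request; the actual project's router and adapter set are described in the setup notes linked above.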
Author Public Key
npub165dfl8vze94tcyp0zvqkg966qj6jlgdrka7t6d2s97mvrvjtwtesapy85a