Zeugs on Nostr
npub14ujlkefgqgaejvf2n4ry8203x432psn4maqp2e6xrf3p0gs7fj4sqp4dcp (npub14uj…4dcp) you do need about a few hundred GB of GPU RAM if you want to run a single instance of a state-of-the-art LLM. Falcon 180B needs 720 GB of graphics memory. That's expensive. Even if you run it at fp16 it's 360 GB. And that's just a single instance (I have no idea how many clients you can serve with a single instance). You definitely need to talk to sales if you want some instances of cloud hardware that can handle such big LLMs.
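As a rough sanity check on those numbers, here is a minimal back-of-the-envelope sketch (the helper name is mine, not from the post); it assumes the 720 GB figure corresponds to 4 bytes per parameter and counts only the weights, ignoring KV cache and activations:

# Back-of-the-envelope VRAM needed just to hold the model weights.
def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    # params_billion * 1e9 params * bytes_per_param / 1e9 bytes-per-GB; the 1e9s cancel.
    return params_billion * bytes_per_param

# Falcon 180B has roughly 180 billion parameters.
print(weight_memory_gb(180, 4))  # fp32: ~720 GB, the figure quoted above
print(weight_memory_gb(180, 2))  # fp16: ~360 GB
print(weight_memory_gb(180, 1))  # int8 quantization would halve it again (assumption, not in the post)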
Published at 2023-11-18 15:50:44
Event JSON
{
"id": "b2b8407c14574f0cc02d8ed1cdde20c913c2d337da939fd692dc4b7c3ac3cb8e",
"pubkey": "494ba09bca593ee3fbddefee4872c79b17b2c061eabba30753d2624623bed345",
"created_at": 1700322644,
"kind": 1,
"tags": [
[
"p",
"af25fb6528023b99312a9d4643a9f13562a0c275df401567461a6217a21e4cab",
"wss://relay.mostr.pub"
],
[
"p",
"f6541a7c6f86a7a22d95858e84398094e9342d0dcec1c7ba3acfdfe9696fc995",
"wss://relay.mostr.pub"
],
[
"e",
"00ec833e50a4baeea90df4f092432f8ee0984e60e2d9d7fd6b57cc04ddfd139d",
"wss://relay.mostr.pub",
"reply"
],
[
"proxy",
"https://social.cologne/users/Zeugs/statuses/111432344800878113",
"activitypub"
]
],
"content": "nostr:npub14ujlkefgqgaejvf2n4ry8203x432psn4maqp2e6xrf3p0gs7fj4sqp4dcp do you you need about a few hundred gb of GPU ram if you want to use a single instance of an state of the art LLM Falcon 180b needs 720GB graphics memory. That's expensive. Even if you run it at fp16 it's 360 GB. And that's just a single instance(I have no idea how much clients you can supply with a single instance). You definitely need to talk to sales if you want some instances of cloud hardware that can handle such big lllm's.",
"sig": "7c23a15826511ec77dcf43a7dd8be37a754fd2f0646733934eeb54e60f836acefbfa700c1ddde352e4e865e7b8c1d0cf711bdf0afa59940a1c3441f3a5b51791"
}