S!ayer on Nostr: I have a PC - 6GB card and 16GB RAM. I mean, for the life of me I can't seem to find ...
I have a PC - 6GB card and 16GB RAM.
I mean, for the life of me I can't seem to find the right model to use. Llama seems great; in fact, IMO it runs better than ChatGPT, but that's probably because Meta has more NLP training data from its Facebook/Meta platforms...
Is there a way to run Llama locally without needing AWS, Azure, or an RTX 4090?
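(For context on the question above: yes, this is a common setup. A 6GB GPU with 16GB RAM can run a 7B/8B Llama model quantized to 4-bit entirely offline. One widely used route, sketched here as an assumption rather than the only option, is the Ollama CLI, which wraps llama.cpp and handles downloading a quantized GGUF build automatically.)

```shell
# Sketch, assuming Ollama is installed from ollama.com.
# Pull a 4-bit quantized 8B Llama build (~4-5GB), which fits a 6GB card
# with partial GPU offload; the rest runs on CPU from system RAM.
ollama pull llama3:8b

# Run it fully locally - no AWS, Azure, or high-end GPU required.
ollama run llama3:8b "Summarize what quantization does."
```

The same hardware also works with llama.cpp directly (loading a Q4-quantized GGUF file and offloading as many layers as fit in VRAM); Ollama just automates the model download and quantization choice.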
Published at 2024-09-16 11:59:55
Event JSON
{
"id": "881f8679e587732fb16d7964b64b0b160e897a0952393bb8d393989c5308387a",
"pubkey": "cdecc31c6d9406e9a7d6b0067412aa661d9d31c8035c3fd65c06301d1cac3b92",
"created_at": 1726487995,
"kind": 1,
"tags": [
[
"e",
"3ace45519383b4c02a690755dce5a353c9ba8c46bdf602645304e0a44cfe7c66",
"wss://offchain.pub/",
"root"
],
[
"e",
"0467925b4f3035eda1664712645c404d2d1dbb2a76a5152828679b25b4803a28",
"",
"reply"
],
[
"p",
"576d23dc3db2056d208849462fee358cf9f0f3310a2c63cb6c267a4b9f5848f9",
"",
"mention"
]
],
"content": "I have a PC - 6GB card and 16GB RAM. \n\nI mean, for the life of me I can't seem to find the right model to use. Llama seems great, in fact, imo it runs better than ChatGPT but that's because they have more nlp from Facebook/Meta platforms... \n\nIs there a way to run lllama on local without the need for AWS or Azure or a 4090 GTX",
"sig": "0ecf070a2173105fd2e2a33a0221232dc16496d017abd0355a7cebf830811cf0ed64f1e4896fac5e17fed522575329101f5bb048d3e94356cfb2a356fbb5d34d"
}