Nazo on Nostr: Not really "now available." Local LLM inferencing has been available for quite a long ...
Not really "now available." Local LLM inference has been available for quite a long time now. My personal preference is KoboldCPP, since it's leaner and simpler than most.
I suggest that anything Python-based like that be run through Miniconda or Anaconda. Even with a venv, such tools can make a real mess of things sometimes, and distros are moving away from letting users run pip natively anyway. Also, most of these tools prefer Python 3.10, which is horribly outdated.
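A minimal sketch of the conda workflow being suggested, assuming Miniconda or Anaconda is already installed (the environment name `llm-env` and the `requirements.txt` file are illustrative, not tied to any specific project):

```shell
# Create an isolated environment with its own Python interpreter
# ("llm-env" is a hypothetical name; pin whatever version the tool asks for)
conda create -n llm-env python=3.10 -y

# Switch into the environment before installing anything
conda activate llm-env

# pip now installs into the env, not the system Python,
# so distro-managed packages are left untouched
python -m pip install -r requirements.txt
```

Because everything lands inside the environment, a broken install can be thrown away with `conda env remove -n llm-env` rather than cleaned out of the system Python.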
Published at 2024-06-14 21:27:51

Event JSON
{
"id": "cc14fd72024daea020772f9666e780255fe3032b2aa8b84d17df5e09c2fca2a5",
"pubkey": "d8dc6ebef12663fd99389022a13a59d3a688887434f51df750e9935566e14384",
"created_at": 1718400471,
"kind": 1,
"tags": [
[
"e",
"ff12b96f8a29139fba6519e9ec1748253949cf098032067a9f9b9d26e9882c03",
"",
"root"
],
[
"proxy",
"https://mastodon.social/@nazokiyoubinbou/112617093285705132",
"web"
],
[
"p",
"cee38a9f17a122c9ea7190d2381ff3760fb231bfd75c7f25f5ec088fbaf65170"
],
[
"proxy",
"https://mastodon.social/users/nazokiyoubinbou/statuses/112617093285705132",
"activitypub"
],
[
"L",
"pink.momostr"
],
[
"l",
"pink.momostr.activitypub:https://mastodon.social/users/nazokiyoubinbou/statuses/112617093285705132",
"pink.momostr"
]
],
"content": "Not really \"now available.\" Local LLM inferencing has been available for quite a long time now. My personal preference being KoboldCPP since it's leaner and simpler than most.\n\nI suggest anything that runs on Python like that should run through Miniconda or Anaconda. Despite the venv, they make a real mess of things sometimes and distros are now moving away from the user using pip natively anyway. Also most prefer Python 3.10 which is horribly outdated anyway.",
"sig": "219494b669efd5e598be3f94647864afeab4815d1e456cce8247d0c0d57bfc5137fa960c0323a397e734c7e0906c7a4be7d07826e2fdb40bc051f927ea88e871"
}