{
"id":"04de2cd0d1b9119ef0e2c4dd2274480c6f6ddfb5ea372ce51e2c49e155e99240",
"pubkey":"1807a49c19a1347e6f19729697c15d4f53df5482ecf3eeddfa0c8e7d0fa245a1",
"created_at":1741619720,
"kind":1,
"tags": [
[
"e",
"b82aad5f633e92840daf3034fd0dc93bbbc23940ed160188719eb48a3725db11",
"",
"mention"
]
],
"content":"There's only one solution: \n\nRun a personal, local agent. \n\nThe billion zillion GPUs are only necessary for training. You can fit a 30B-70B open-source model on a consumer GPU(s). \n\nIf people can afford $1,000+ gaming GPUs, then there's no excuse to not have a personal AI workstation.\n\nThe latency is also amazing. \n\nnostr:note1hq426hmr86fggrd0xq606rwf8wauyw2qa5tqrzr3n66g5de9mvgsc52v69",
"sig":"901e3989d0ef19f90fd259c845d844fc6e8949f4715559fe336a0def952205e69b3b214c66939212851f46def90b7cdbe81413d2ec086cb37e3344f3f78ba4a3"
}
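
For context on the claim in the content field that a 30B-70B open-source model fits on a consumer GPU: below is a minimal sketch of the "personal, local agent" setup, assuming llama-cpp-python and a 4-bit quantized GGUF build of an open-weight ~30B model. The model path and prompt are illustrative, not taken from the note; quantization is what lets a model of that size fit in roughly 24 GB of VRAM, and full GPU offload is what gives the low latency the note mentions.

# Sketch only: local inference with llama-cpp-python (assumed setup,
# not a reference implementation from the note above).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/qwen2.5-32b-instruct-q4_k_m.gguf",  # hypothetical local quantized model file
    n_gpu_layers=-1,   # offload all layers to the consumer GPU
    n_ctx=4096,        # context window size
    verbose=False,
)

# Ask the local model a question; no remote GPUs involved after training.
result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize today's notes in three bullets."}],
    max_tokens=256,
)
print(result["choices"][0]["message"]["content"])

The same pattern works with any GGUF-format open-weight model; only the model_path and the quantization level need to change to trade quality against VRAM.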