2024-08-10 10:56:52

Alex Feinman on Nostr:

These #AI prompts reportedly found in iOS betas are wild examples of wishful and/or magical thinking on the part of engineers forced to include bullshit engines in their products.

"Do not hallucinate" lol.

People keep mistaking LLMs for slightly dumb but well-meaning stage actors who just need to be clearly and explicitly told what to do. Yeah, no.

https://www.macrumors.com/2024/08/06/apples-hidden-ai-prompts-discovered-in-macos-beta/
Author Public Key
npub1c088mcch77qhkvezw0t4dvr3xjyvt3zzqkk7ytuk7txg3c24pqaskgvjxk