Alex Feinman on Nostr: These #AI prompts reportedly found in iOS betas are wild examples of wishful and/or ...
These #AI prompts reportedly found in iOS betas are wild examples of wishful and/or magical thinking on the part of engineers forced to include bullshit engines in their products.
"Do not hallucinate" lol.
People keep mistaking LLMs for a slightly dumb but well-meaning stage actor who just needs to be clearly and explicitly told what to do. Yeah, no.
https://www.macrumors.com/2024/08/06/apples-hidden-ai-prompts-discovered-in-macos-beta/

Published at: 2024-08-10 10:56:52

Event JSON:
{
  "id": "80e8bf13d2380cf99ea9bf350e1ea1293672c1ca287cffbff92e10652262e349",
  "pubkey": "c3ce7de317f7817b332273d756b0713488c5c44205ade22f96f2cc88e155083b",
  "created_at": 1723287412,
  "kind": 1,
  "tags": [
    [
      "t",
      "ai"
    ],
    [
      "proxy",
      "https://wandering.shop/users/afeinman/statuses/112937363890138855",
      "activitypub"
    ]
  ],
  "content": "These #AI prompts reportedly found in iOS betas are wild examples of wishful and/or magical thinking on the part of engineers forced to include bullshit engines in their products.\n\n\"Do not hallucinate\" lol.\n\nPeople keep mistaking LLMs for a slightly dumb but well-meaning stage actor who just needs to be clearly and explicitly told what to do. Yeah, no.\n\nhttps://www.macrumors.com/2024/08/06/apples-hidden-ai-prompts-discovered-in-macos-beta/",
  "sig": "589b4c039edacfc866cd6dcae11279a7200f9ec2d462aa70b985ebdab3bb2e1765f239fb0be4c2d8911aba6729a64e25c5b4a3b7dc291dedef785c6703d4f526"
}
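For context, the "id" field in the event above is not arbitrary: under the Nostr NIP-01 specification, it is the SHA-256 hash of a canonical, whitespace-free JSON serialization of the array `[0, pubkey, created_at, kind, tags, content]`. A minimal sketch of that derivation (assuming Python and the standard NIP-01 serialization rules; `nostr_event_id` is an illustrative helper name, not a library function):

```python
import hashlib
import json

def nostr_event_id(pubkey: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    # NIP-01: serialize [0, pubkey, created_at, kind, tags, content]
    # as compact JSON (no extra whitespace), UTF-8 encoded.
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),   # no spaces after , or :
        ensure_ascii=False,      # keep non-ASCII characters literal
    )
    # The event id is the lowercase hex SHA-256 of that serialization.
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()
```

The "sig" field is then a Schnorr signature over this id using the key in "pubkey", which is how relays and clients verify the post actually came from the listed account.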