Nick Doty on Nostr:
Referring to adding a new token to your `robots.txt` as an "opt-out" is strange. Opting out generally means an ability to withdraw participation. But I haven't heard any generative AI training company suggest that they'll delete the data they crawled and update their models to make sure they're no longer using that content for inferences.
From a copyright -- and much more importantly, privacy -- perspective, does "we already trained on your data and will never forget it" seem like opting out?
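
For illustration, the kind of robots.txt "opt-out" token under discussion looks like this (GPTBot is the crawler name OpenAI documents; the directive only asks future crawls to skip the site and says nothing about content already collected):

    # Ask OpenAI's crawler not to fetch anything on this site going forward
    User-agent: GPTBot
    Disallow: /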
Published at 2024-06-14 15:20:52

Event JSON
{
  "id": "5cedf03c66efdfa1fc29c4a028c8b142a515680281a269f6ec9d950e643dd3a7",
  "pubkey": "3a816f122e8a7232789d56880556366a70fb0cf223e05682db3d2f6f01a09940",
  "created_at": 1718378452,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://techpolicy.social/users/npdoty/statuses/112615650233895518",
      "activitypub"
    ]
  ],
  "content": "Referring to adding a new token to your `robots.txt` as an \"opt-out\" is strange. Opting out generally means an ability to withdraw participation. But I haven't heard any generative AI training company suggest that they'll delete the data they crawled and update their models to make sure they're no longer using that content for inferences.\n\nFrom a copyright -- and much more importantly, privacy -- perspective, does \"we already trained on your data and will never forget it\", seem like opting out?",
  "sig": "04e7a8f69e32e3bbe8db8d6c2e69d3368917dbf08c6d3ebaa3a652c98573751b5e4542c5616970442355bb52942c0aacb23f8731fae1700f59d7544b116c2bac"
}
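
For reference, the fields above follow Nostr's NIP-01 event format: "id" is the SHA-256 hash of a canonical serialization of the other fields, and "sig" is a BIP-340 Schnorr signature over that id by "pubkey" (the "proxy" tag, per NIP-48, points back to the original ActivityPub post). A minimal sketch of recomputing the id with Python's standard library, signature check omitted:

    import hashlib
    import json

    # Fields copied from the event JSON above
    pubkey = "3a816f122e8a7232789d56880556366a70fb0cf223e05682db3d2f6f01a09940"
    created_at = 1718378452
    kind = 1
    tags = [["proxy", "https://techpolicy.social/users/npdoty/statuses/112615650233895518", "activitypub"]]
    content = "Referring to adding a new token to your `robots.txt` as an \"opt-out\" is strange. Opting out generally means an ability to withdraw participation. But I haven't heard any generative AI training company suggest that they'll delete the data they crawled and update their models to make sure they're no longer using that content for inferences.\n\nFrom a copyright -- and much more importantly, privacy -- perspective, does \"we already trained on your data and will never forget it\", seem like opting out?"

    # NIP-01: the id is sha256 over the JSON array [0, pubkey, created_at, kind, tags, content],
    # serialized with no extra whitespace and UTF-8 encoded
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
    print(event_id)  # should match the "id" field above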