2024-06-14 15:20:52

Nick Doty on Nostr:

Referring to adding a new token to your `robots.txt` as an "opt-out" is strange. Opting out generally means an ability to withdraw participation. But I haven't heard any generative AI training company suggest that they'll delete the data they crawled and update their models to make sure they're no longer using that content for inferences.
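For context, the "token" in question is a crawler user-agent directive in `robots.txt`. A minimal sketch of what such an opt-out looks like, using the publicly documented GPTBot (OpenAI) and Google-Extended (Google) tokens; any other vendor would require its own entry:

```txt
# Ask OpenAI's training crawler not to fetch this site.
User-agent: GPTBot
Disallow: /

# Ask Google not to use this site's content for its AI models.
User-agent: Google-Extended
Disallow: /
```

Note that these directives only affect future crawling; they say nothing about data already collected or models already trained, which is the point at issue.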

From a copyright perspective, and much more importantly a privacy perspective, does "we already trained on your data and will never forget it" seem like opting out?
Author Public Key: npub182qk7y3w3fery7ya26yq243kdfc0kr8jy0s9dqkm85hk7qdqn9qqk5ajes