Why Nostr? What is Njump?
2023-06-05 16:24:07

Emma Roth / theverge.space on Nostr: Twitter left up dozens of known images of child sexual abuse material, a new study ...

Over a span of two months, researchers at the Stanford Internet Observatory found more than 40 CSAM images that had been previously flagged, out of a pool of 100,000 tweets, as first reported by The Wall Street Journal.

Outgoing Twitter CEO Elon Musk has touted child safety as his number one priority since taking over the platform last year. However, numerous reports have since suggested that Twitter is not doing a good job of preventing the spread of child exploitation imagery.

twitter.com/stanfordio/status/1665724965407055872
Author Public Key
npub1zwh4gk4t8knws2rzhjulet4sycmqd4zxqp9zfmlr942fn032gnksedm3af