Emma Roth / theverge.space on Nostr: Twitter left up dozens of known images of child sexual abuse material, a new study ...
Twitter left up dozens of known images of child sexual abuse material, a new study says.
Over a roughly two-month span, researchers at the Stanford Internet Observatory found more than 40 previously flagged CSAM images in a pool of 100,000 tweets, as first reported by The Wall Street Journal.
Outgoing Twitter CEO Elon Musk has touted child safety as his number one priority since taking over the platform last year. However, numerous reports have since suggested Twitter isn’t doing a good job of preventing the spread of child exploitation imagery.
twitter.com/stanfordio/status/1665724965407055872
Published at 2023-06-05 16:24:07

Event JSON
{
  "id": "e02a5051a229cb16641c86fbd2065e1bfd75d8e2ab5a274cb029413a947beb6a",
  "pubkey": "13af545aab3da6e82862bcb9fcaeb0263606d446004a24efe32d5499be2a44ed",
  "created_at": 1685982247,
  "kind": 1,
  "tags": [
    [
      "mostr",
      "https://theverge.space/actor/emma_roth/note/91ecd243-bf6d-4b6d-a9ce-135e78e63875"
    ]
  ],
  "content": "Twitter left up dozens of known images of child sexual abuse material, a new study says.\n\nIn the span of over two months, researchers at the Stanford Internet Observatory found more than 40 CSAM images that had been flagged in the past out of a pool of 100,000 tweets, as first reported by The Wall Street Journal.\n\nOutgoing Twitter CEO Elon Musk has touted child safety as his number one priority since taking over the platform last year. However, numerous reports have since suggested Twitter isn’t doing all that great of a job at preventing the spread of child exploitation imagery.\n\ntwitter.com/stanfordio/status/1665724965407055872",
  "sig": "8570536c0e385489a3859d0043e8bf1f5a85c53acf5f73b1d5bf3fa4cb369437da43d45312406f987e070039056a85122521ae21d94e10826cfc0d52f8731f84"
}
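For reference, the `id` field of a Nostr event like the one above is not arbitrary: per NIP-01, it is the SHA-256 hash of a canonical JSON serialization of the event's other fields (`[0, pubkey, created_at, kind, tags, content]`, with no extra whitespace and unescaped UTF-8). A minimal Python sketch of that computation follows; the sample values in the usage line are illustrative placeholders, and the result has not been re-verified against this event's signature.

```python
import hashlib
import json


def nostr_event_id(pubkey: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    """Compute a Nostr event id per NIP-01.

    The id is the lowercase hex SHA-256 of the JSON array
    [0, pubkey, created_at, kind, tags, content], serialized with
    no whitespace between tokens and UTF-8 left unescaped.
    """
    payload = [0, pubkey, created_at, kind, tags, content]
    serialized = json.dumps(payload, separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()


# Illustrative usage with placeholder values (not this event's real fields):
example_id = nostr_event_id("ab" * 32, 1685982247, 1, [], "hello nostr")
```

The `sig` field is then a Schnorr signature over this id by the key in `pubkey`, which is why relays can verify a note like this one without trusting the relay that forwarded it.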