Alex Stamos on Nostr:
The Stanford Internet Observatory team has updated our work earlier this year on Self-Generated Child Sexual Abuse Materials.
https://cyber.fsi.stanford.edu/io/news/update-sg-csam-ecosystem Some key takeaways:
1) X’s elimination of academic API access has seriously hampered our ability to fight child exploitation on the platform. Using alternative means, we still found 81 accounts openly selling CSAM on Twitter, as well as advertisements from large companies running beside them.
Published at 2023-09-22 13:52:21

Event JSON:
{
  "id": "a01e551bf35a94bade3bdd922c621f5d0197a506e6cd111de153f9b6b9cc6d89",
  "pubkey": "cd1e48c145a46c437637a19149049827a77286aeeef568566ef0399870bde650",
  "created_at": 1695390741,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://cybervillains.com/users/alex/statuses/111109127628632804",
      "activitypub"
    ]
  ],
  "content": "The Stanford Internet Observatory team has updated our work earlier this year on Self-Generated Child Sexual Abuse Materials. \n\nhttps://cyber.fsi.stanford.edu/io/news/update-sg-csam-ecosystem \n\nSome key takeaways:\n\n1) X’s elimination of academic API access has seriously hampered our ability to fight child exploitation on the platform. Using alternative means, we still found 81 accounts openly selling CSAM on Twitter, as well as advertisements from large companies running beside them.",
  "sig": "3b7674473dcf6c2ed42b3aa932c421446b9eceddb7b3fb6f16f0892479282547692a9854c099f62c0785ba1614cb8934d1c721d6fd18c716853bdf8fe916c773"
}
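
The `id` and `sig` fields of the event are derived from the other fields: per Nostr's NIP-01 specification, the `id` is the SHA-256 hash of the canonical JSON serialization of the array `[0, pubkey, created_at, kind, tags, content]`, with no extra whitespace. A minimal Python sketch of that computation (the `content` here is truncated for brevity, so the resulting digest will not equal the `id` shown above):

```python
import hashlib
import json

def compute_event_id(event: dict) -> str:
    """Compute a Nostr event id per NIP-01: SHA-256 of the canonical
    JSON serialization [0, pubkey, created_at, kind, tags, content]."""
    serialized = json.dumps(
        [0, event["pubkey"], event["created_at"], event["kind"],
         event["tags"], event["content"]],
        separators=(",", ":"),  # compact form: no whitespace between tokens
        ensure_ascii=False,     # keep UTF-8 characters unescaped
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Illustrative event with the content string shortened (hypothetical input,
# so the digest will differ from the real event's id).
event = {
    "pubkey": "cd1e48c145a46c437637a19149049827a77286aeeef568566ef0399870bde650",
    "created_at": 1695390741,
    "kind": 1,
    "tags": [
        ["proxy",
         "https://cybervillains.com/users/alex/statuses/111109127628632804",
         "activitypub"]
    ],
    "content": "The Stanford Internet Observatory team has updated our work...",
}

print(compute_event_id(event))  # prints a 64-character lowercase hex digest
```

The `sig` is then a Schnorr signature over that 32-byte id, made with the key whose public half appears in `pubkey`, which is what lets any relay or client verify the event independently.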