nostr.build on Nostr
Hi praxeologist, good question. We use an AI model designed to flag and report CSAM. If the model is less than 90% confident, we manually review the media. Confirmed CSAM is removed and reported to NCMEC, and the uploading account is blocked from using our service.
We also filter media that suggests anything inappropriate involving a child, including cartoons, adult clothing, positions, etc. We have recently stopped allowing free uploads of adult porn (intercourse).
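To make that triage flow concrete, here is a minimal sketch. This is not nostr.build's actual code; the function names and data shapes are hypothetical stand-ins, and only the 90% threshold, the manual-review fallback, and the remove/report/block steps come from the post:

from dataclasses import dataclass

CONFIDENCE_THRESHOLD = 0.90  # from the post: below 90%, flagged media goes to manual review

@dataclass
class ScanResult:
    flagged: bool      # classifier believes the media is CSAM
    confidence: float  # classifier confidence in [0.0, 1.0]

def queue_for_manual_review(media_id: str) -> str:
    # Hypothetical stand-in for a human review queue.
    print(f"queued {media_id} for manual review")
    return "manual_review"

def remove_report_and_block(media_id: str, uploader: str) -> str:
    # Hypothetical stand-ins for removal, the NCMEC report, and the account block.
    print(f"removed {media_id}, reported to NCMEC, blocked {uploader}")
    return "removed_reported_blocked"

def triage(media_id: str, uploader: str, result: ScanResult) -> str:
    if result.flagged and result.confidence >= CONFIDENCE_THRESHOLD:
        # High-confidence hit: act immediately.
        return remove_report_and_block(media_id, uploader)
    if result.flagged:
        # Flagged but below the threshold: escalate to a human.
        return queue_for_manual_review(media_id)
    return "allowed"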
We have experimented with Cloudflare's CSAM Scanning Tool and Microsoft PhotoDNA, but find the AI model most accurate; those services match uploads against hashes of already-known material, whereas a classifier can also flag content that has not been catalogued before.
More info can be found here:
https://nostr.build/features/

Published at 2025-06-02 15:53:59

Event JSON
{
  "id": "22f43afe952ee97ef666040052ca2cc7be5df303c5210c1dc95b2f6a6492c12a",
  "pubkey": "9989500413fb756d8437912cc32be0730dbe1bfc6b5d2eef759e1456c239f905",
  "created_at": 1748879639,
  "kind": 1,
  "tags": [
    ["e", "4236e8a8d945974382c8e715e3e6e8eca79b925f29dc941305d6ad44c70f5f84", "", "root"],
    ["p", "8fb140b4e8ddef97ce4b821d247278a1a4353362623f64021484b372f948000c"],
    ["p", "eaaf8daada728bf6231443ac989f952f0adb657b7b9b8110e4a683706b938e5c"],
    ["r", "https://nostr.build/features/"]
  ],
  "content": "Hi praxeologist, good question. We use an AI model designed to flag and report CSAM. If it is not 90% sure, we will manually review the media. CSAM is removed and reported to the NCMEC. The account is then blocked from using our service.\nWe also filter media that suggests anything inappropriate with a child including cartoons, adult clothing, positions, etc. We have recently stopped allowing free adult porn (intercourse) uploads.\n\nWe have experimented with the Cloudflare filter and Microsoft PhotoDNA, but find the AI model is most accurate.\n\nMore info can be found here:\nhttps://nostr.build/features/",
  "sig": "4c9f4442257638fc6313a2d22b2cfcdcc3880de5508d3a4a31a9229af4c45fab47ec99b0966c55f446c991fd4d8c2c1a3b9ab732ccc3382c20224d67ee122ad1"
}
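For readers unfamiliar with the raw event above: per NIP-01, the id is the SHA-256 hash of a canonical JSON serialization of the event's fields, and sig is a Schnorr signature over that id made with the key behind pubkey. Below is a minimal Python sketch of recomputing the id (standard library only; verifying sig would additionally require a secp256k1 Schnorr implementation, and NIP-01's character-escaping rules coincide with Python's default JSON escaping for the characters appearing here):

import hashlib
import json

def nostr_event_id(event: dict) -> str:
    # NIP-01: id = sha256 over the UTF-8 bytes of the JSON array
    # [0, pubkey, created_at, kind, tags, content], serialized with no
    # whitespace between tokens and non-ASCII characters left unescaped.
    payload = [0, event["pubkey"], event["created_at"],
               event["kind"], event["tags"], event["content"]]
    serialized = json.dumps(payload, separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

Running this over the event above should reproduce its id field exactly; any edit to the content or tags changes the hash and invalidates the signature.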