rabble on Nostr
Yeah, I think many countries would consider ML-generated CSAM to be the same thing as an actual picture or video taken of a sexualized child.
UK for example:
https://www.bbc.com/news/uk-65932372
And in Australia it can be text, not just images or video:
https://www.lexology.com/library/detail.aspx?g=be791d54-9165-4233-b55a-4b9dad5d178d
The risk for most relay operators is that people will use Nostr for the discovery of / connection between people who then go on to other apps / servers to actually exchange the content. Apparently it’s a kind of whack-a-mole with the different hashtags people use to search for this kind of content.
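A minimal sketch of what relay-side hashtag screening could look like, assuming a hypothetical denylist (the BLOCKED_HASHTAGS entries below are placeholders, not a real list; hashtags on Nostr events are carried in "t" tags per NIP-12/NIP-24):

# Hypothetical denylist; a real operator would pull this from shared,
# frequently updated blocklists, since the terms rotate (the whack-a-mole
# described above).
BLOCKED_HASHTAGS = {"placeholder-term-1", "placeholder-term-2"}

def should_reject(event: dict) -> bool:
    """Return True if a note carries a blocked hashtag.

    Checks "t" tags (NIP-12 hashtags) and inline #hashtags in the content.
    """
    for tag in event.get("tags", []):
        if len(tag) >= 2 and tag[0] == "t" and tag[1].lower() in BLOCKED_HASHTAGS:
            return True
    words = event.get("content", "").lower().split()
    return any(w.lstrip("#") in BLOCKED_HASHTAGS
               for w in words if w.startswith("#"))

Keyword matching alone only catches known terms, which is exactly why operators describe it as whack-a-mole; it is a complement to, not a substitute for, reporting to bodies like NCMEC and hash-based detection of the media itself.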
Published at 2023-08-02 05:41:45
Event JSON
{
  "id": "6e6dd3a685d9fdd2103a54df9ee4409dee7e657ea3590262a258d216e3e8f110",
  "pubkey": "76c71aae3a491f1d9eec47cba17e229cda4113a0bbb6e6ae1776d7643e29cafa",
  "created_at": 1690954905,
  "kind": 1,
  "tags": [
    [
      "p",
      "76c71aae3a491f1d9eec47cba17e229cda4113a0bbb6e6ae1776d7643e29cafa"
    ],
    [
      "e",
      "a391ebb89680ac5ee36c0ff188642b85cf39198c87f25ee029f301d04f113dc6",
      "",
      "root"
    ]
  ],
  "content": "Yeah, I think many countries would consider ML generated CSAM to be the same thing as an actual picture or video taken of a sexualized child. \n\nUK for example: https://www.bbc.com/news/uk-65932372\nAnd Australia it can be text, not just images or video: https://www.lexology.com/library/detail.aspx?g=be791d54-9165-4233-b55a-4b9dad5d178d\n\nThe risk for most relay operators is that people will using nostr for the discovery / connection between people who then go in to other apps / servers to actually exchange the content. Apparently it’s a kind of whackamole with different hashtags people search for this kind of content. ",
  "sig": "0f3161e47298e3f8d1493cf2750e515fb5826ebe40ab869e7052883290e4a1ef64dd2edadbd491ce3782854aa16cc32ee068ce10068ec31024ebee53dfcae1fc"
}
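For readers checking the record above: per NIP-01, the id field is the SHA-256 of the serialized array [0, pubkey, created_at, kind, tags, content]. A minimal sketch, assuming the event JSON has been parsed into a dict named event (json.dumps's default control-character escaping matches NIP-01 serialization for typical content like this):

import hashlib
import json

def compute_event_id(event: dict) -> str:
    # NIP-01: id = sha256 of the UTF-8 JSON serialization of
    # [0, pubkey, created_at, kind, tags, content], with no extra whitespace.
    serialized = json.dumps(
        [0, event["pubkey"], event["created_at"], event["kind"],
         event["tags"], event["content"]],
        separators=(",", ":"),
        ensure_ascii=False,
    ).encode("utf-8")
    return hashlib.sha256(serialized).hexdigest()

# compute_event_id(event) should equal the "id" field shown above.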