2024-10-13 14:31:53


Hello Nostr, if you are in a great mood just skip this post; it's depressing.

So I had not encountered it before, but yesterday I crossed paths with Child Sexual Abuse Material on Nostr. In my regular internet usage over the years I have rarely come across this stuff, though I guess if I went looking for it I would find it eventually. That is to say, the status quo is that it does exist, but most people, most of the time, won't have to deal with it. I think it is important to realize that the world is not perfect as it is when reflecting on these matters in the context of Nostr.

It goes without saying, but just to be clear: yes, I think we should all learn how to tie nooses and identify adequate oak trees.

However marginalized CSAM is, some people want governments to go above and beyond to combat it. The prime example currently is the 'Chat Control' regulation proposed in the EU, which would install Big Brother client-side on your phone to scan every single thing you do and flag any suspicious behavior or content before it gets encrypted. However understandable the motivation might be, even advocacy groups and agencies dealing with the CSAM problem are against this kind of measure, if only because they are already swamped with processing material as it is; opening the floodgates with false positives won't help anything and will probably make the situation worse. That is aside from the obvious objections to forcibly installing Big Brother on people's hardware, of course.

Back to Nostr. On the one hand we have the end user, who does not want to be confronted with this material. From this perspective, CSAM is just one of many things a user might want to filter out, along with other material that might not be illegal per se but is simply NSFW. Whatever means we find to do this, failure of those mechanisms is bad and unwanted, but it is not a direct systemic risk to Nostr; as I mentioned at the beginning, it is not impossible to accidentally come across this type of material on today's internet as it is, and the whole world is still using it.
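To make the user-side filtering idea concrete, here is a minimal sketch of what a client could do, assuming a simplified Nostr event shape, a user-maintained mute list (in the spirit of NIP-51), and NIP-36 style content-warning tags; the helper names (`isHidden`, `mutedPubkeys`) are illustrative and not taken from any existing client.

```typescript
// Minimal sketch of client-side filtering. The event fields follow the
// standard Nostr event structure; everything else here is illustrative.

interface NostrEvent {
  id: string;
  pubkey: string;
  kind: number;
  tags: string[][];
  content: string;
}

// Hex pubkeys the user has chosen to mute, e.g. loaded from a NIP-51 style mute list.
const mutedPubkeys = new Set<string>([]);

function isHidden(event: NostrEvent): boolean {
  // Drop anything from muted authors.
  if (mutedPubkeys.has(event.pubkey)) return true;

  // Respect NIP-36 style content warnings: hide (or blur) flagged events
  // until the user explicitly opts in to viewing them.
  const hasContentWarning = event.tags.some((t) => t[0] === "content-warning");
  if (hasContentWarning) return true;

  return false;
}

// A feed is then simply the events that survive the filter:
// const visible = incomingEvents.filter((e) => !isHidden(e));
```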

But it does become a systemic issue from the relay perspective. Here, it is not some incidental bad experience that can be clicked away: it is a crime to host this type of material, which brings the risk of prosecution for 'simply running a relay' that some asshole decided to nuke with CSAM or other illegal material.

But here my optimism comes in. Nostr is pro-censorship: the theory is that every relay can moderate to its heart's content, because users are ultimately always able to route around such obstacles (very much like 'the internet' itself). This means that relays should be able to adjust their policies and methods of moderation to their capacity to deal with unwanted content and to their risk appetite: from a locked-down, whitelist-only relay on one side of the spectrum, all the way to an open relay with heavy, sophisticated analytics for assessment and filtering, and everything in between. Even though that won't deliver a perfect solution in all cases, it will remove the dark cloud of systemic risk to the protocol and network, because we are able to sufficiently marginalize the phenomenon.
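As a toy illustration of the locked-down end of that spectrum, here is a sketch of a whitelist-only write policy, where the relay only accepts events signed by pubkeys it has explicitly agreed to host. The `acceptEvent` interface is an assumption for the sake of the example; real relays expose this differently (configuration files, write-policy plugins, and so on).

```typescript
// Minimal sketch of a whitelist-only relay write policy.
// Only the pubkey field of the event matters for this decision.

type Decision = { accept: true } | { accept: false; reason: string };

// Hex pubkeys of the users this relay is willing to host.
const allowedPubkeys = new Set<string>([]);

function acceptEvent(event: { pubkey: string }): Decision {
  if (!allowedPubkeys.has(event.pubkey)) {
    return { accept: false, reason: "blocked: pubkey is not on this relay's whitelist" };
  }
  return { accept: true };
}
```

An open relay would replace this hard gate with softer, more expensive checks (rate limits, media analysis, report handling), which is exactly the trade-off between capacity and risk appetite described above.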

On a last note: when talking about filtering and assessing this content, it gets complicated really quickly. You can imagine some AI performing such a task, or filtering against lists of known content; however you want to do it, you first face the question of how that tooling gets constructed in the first place, which requires gathering such content and human eyes looking at it. And then, subsequently, you have produced tooling that can be flipped around and used as a search engine to seek out such material instead of filtering it away. So yeah, there are no graceful, perfect solutions I am afraid.
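For the 'lists of known content' approach, the basic mechanic is just hashing incoming media and checking it against a blocklist, roughly like the sketch below. Real deployments use perceptual hashes so that re-encoded copies still match, and the lists themselves are maintained by specialized organizations; `knownBadHashes` here is a placeholder, not a real source.

```typescript
import { createHash } from "node:crypto";

// Minimal sketch of hash-list filtering: compare an exact SHA-256 digest of
// each media file against a set of known-bad hashes supplied by a trusted list.

const knownBadHashes = new Set<string>([]); // sha256 hex digests from a trusted list

function isKnownBad(media: Buffer): boolean {
  const digest = createHash("sha256").update(media).digest("hex");
  return knownBadHashes.has(digest);
}
```

Note that the same lookup that filters uploads can also answer "is this material present here?", which is the flip-around problem described above.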

Well, there is one of course….
https://cdn.satellite.earth/a92bdd80dbd45e00636a9db615061eef168c3164a0e1bfa1abfb0784e74cd24e.mp3
Author public key: npub1t6jxfqz9hv0lygn9thwndekuahwyxkgvycyscjrtauuw73gd5k7sqvksrw