SimplifiedPrivacy.com on Nostr
Who decides what to censor as spam?
Nostr and Lens solve the "spam and scam" problem by having the client decide. For example, Amethyst for Android will hide posts from accounts that others report as scams. These "others" are the people you follow, but this essentially puts it up to a community vote of large influencers to silence you.
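The follow-based filtering described above can be sketched in a few lines. This is a hypothetical simplification, not the actual Nostr NIP-56 wire format; the event shapes and field names here are stand-ins for illustration.

```python
# Sketch of follow-based report filtering, roughly how a client like
# Amethyst decides what to hide. Simplified event shapes, not real Nostr events.

def hidden_authors(follows, reports):
    """Authors reported as spam/scam by at least one account you follow."""
    return {r["target"] for r in reports if r["reporter"] in follows}

def visible_feed(events, follows, reports):
    """Drop posts from any author your follows have reported."""
    hidden = hidden_authors(follows, reports)
    return [e for e in events if e["author"] not in hidden]

follows = {"alice", "bob"}
reports = [
    {"reporter": "alice", "target": "scammer1"},   # a follow's report hides this author
    {"reporter": "stranger", "target": "carol"},   # ignored: reporter isn't followed
]
events = [
    {"author": "scammer1", "content": "free sats, click here"},
    {"author": "carol", "content": "gm"},
]
print(visible_feed(events, follows, reports))  # only carol's post survives
```

The key point: whoever you follow becomes your moderator, which is exactly why large influencers end up holding the mute button for everyone downstream.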
On Lens, once you're labeled spam, your replies appear under the "show more" section of the comments. This is a huge turn-off to new users with no followers, who are treated like lower-class citizens.
Farcaster solves it in a similar way, except the official team does the labeling, and since their client is so large and influential, their list is often distributed to other clients. This is absolutely horrible and way too centralized. While it's true that posts to your followers would still show up, they are effectively silencing your comments.
Session has zero censorship for mass DMs in the way I use it, even under outright sanctions. The nodes don't even know I am the sender, and I'm assigned new receivers if they drop me. That's why I like it. But the market likes SimpleX more because it rotates encryption keys, so it's tough to get new followers. Can't fight the market.
Bastyon solves the problem with a community vote for outright illegal content, to get it off the nodes, such as child porn and narcotics sales. The voters are picked based on their total upvotes, called "reputation". I disagree with this approach, because if we're going to vote, it should be the nodes hosting the content that vote (like Arweave does)...
Files on Arweave have an unofficial vote, where the nodes can opt out of storing it. And if all the miners chosen in a block opt out, then there's no financial penalty for dropping the content. But if they have the content and others don't, then they have a financial advantage to mine that block over competitors. This approach is good for websites, but for a social network with permissionless replies, it's way too passive.
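The incentive described above can be shown with toy arithmetic. This is not Arweave's actual mining math (its real mechanism, SPoRA, is more involved); it's a hypothetical sketch of the principle that storing data others dropped gives you an edge.

```python
# Toy model of the storage incentive: a miner's chance of producing the
# block scales with how many of the randomly challenged chunks it stores.
# NOT Arweave's real protocol math; illustration only.

def win_weight(stored_chunks, challenged_chunks):
    """Fraction of this block's storage challenges the miner can answer."""
    answered = len(stored_chunks & challenged_chunks)
    return answered / len(challenged_chunks)

challenge = {"c1", "c2", "c3", "c4"}          # chunks this block challenges
full_miner = {"c1", "c2", "c3", "c4", "c5"}   # kept everything, disputed content included
optout_miner = {"c1", "c2"}                   # opted out of storing half the data

print(win_weight(full_miner, challenge))    # 1.0 — full advantage
print(win_weight(optout_miner, challenge))  # 0.5 — weaker position this block
```

If every chosen miner opts out of the same content, no one loses relative to anyone else, which is why it works as a passive, unofficial vote.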
Therefore:
I disagree with all these solutions.
In my view, the best way to handle spam (in a permissionless system) is to allow the original poster to decide which replies are spam. Then the end user can toggle "criticism and spam" on or off for the replies. After all, if you're following someone, you trust their judgment on the subject they are speaking about. And this decentralizes the decision to each individual poster.
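A minimal sketch of this scheme, under the assumption that the original poster publishes a label list for their own thread and each reader's client honors it only if the reader's toggle is off (all names here are hypothetical, not any existing protocol):

```python
# Sketch of OP-decides moderation: the original poster labels replies as
# spam, and each reader's client applies or ignores those labels via a toggle.
# Hypothetical field names; no real protocol is implied.

def thread_view(replies, op_spam_labels, show_flagged):
    """Return the replies a reader sees, honoring the OP's spam labels
    unless the reader has toggled flagged content back on."""
    if show_flagged:
        return replies
    return [r for r in replies if r["id"] not in op_spam_labels]

replies = [
    {"id": "r1", "content": "great point"},
    {"id": "r2", "content": "buy followers now!!"},
]
op_labels = {"r2"}  # the original poster marked this reply as spam

print(thread_view(replies, op_labels, show_flagged=False))  # r1 only
print(thread_view(replies, op_labels, show_flagged=True))   # both replies
```

Note the design choice: the labels only ever hide content behind a toggle, so the OP curates by default but can never make a reply unreadable.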
Now I do the ironic thing, and turn it over to my replies. Do you think this approach is right?
Published at 2024-08-24 00:06:09