garden_variety_jess on Nostr:
Exactly. I could even see there being a rolling calculation like “past-30-days trust rating” or whatever (would be really cool if the user could set and change those parameters in the client). There are lots of ways to increase trust without censoring and a big part of that includes accountability and transparency while putting the user experience into the hands of the user themselves.
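To make that concrete, here is a minimal sketch of what a user-configurable "past-30-days trust rating" might look like. This is Python for illustration only; the function name `rolling_trust`, the event fields, and the positive/negative interaction model are all invented, not part of any real Nostr client API:

```python
from datetime import datetime, timedelta

def rolling_trust(events, window_days=30, now=None):
    """Hypothetical rolling trust rating: the fraction of a user's
    interactions marked positive within a configurable time window.

    `events` is a list of dicts like {"time": datetime, "positive": bool};
    `window_days` is the parameter the user could tune in their client."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=window_days)
    recent = [e for e in events if e["time"] >= cutoff]
    if not recent:
        return 0.0
    positive = sum(1 for e in recent if e["positive"])
    return positive / len(recent)

# Example: two interactions in the window, one positive -> 0.5
now = datetime(2024, 4, 1)
events = [
    {"time": datetime(2024, 3, 20), "positive": True},
    {"time": datetime(2024, 3, 25), "positive": False},
    {"time": datetime(2024, 1, 1), "positive": True},  # outside the window
]
score = rolling_trust(events, window_days=30, now=now)
```

The key point is that `window_days` lives client-side, so each user tunes their own view rather than inheriting one platform-wide number.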
User reviews - negative and positive - are utilized in e-commerce for the common good. Why do we not also value this kind of “credentialing” so to speak in terms of the quality of our social interactions? Just food for thought.
Except that it doesn't work as a universal score imposed by a government or corporate platform; instead it's a web-of-trust score computed from your own perspective as a user. I think apps should include a moderation calculator for the user by default, plus a default service for new users: one they can swap for a different set of moderation decisions, or none at all.
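A perspective-based web-of-trust score could be as simple as walking your own follow graph outward. The sketch below is a hypothetical illustration, not an existing Nostr mechanism: `wot_score` and the hop-decay weighting are invented for this example:

```python
def wot_score(me, target, follows, depth=2):
    """Hypothetical web-of-trust score from `me`'s own perspective:
    1.0 for a direct follow, halving with each extra hop out to
    `depth` hops, and 0.0 for anyone outside that neighborhood.

    `follows` maps a pubkey to the set of pubkeys it follows."""
    frontier = {me}
    seen = {me}
    weight = 1.0
    for _ in range(depth):
        # Expand one hop outward through the follow graph.
        nxt = set()
        for pk in frontier:
            nxt |= follows.get(pk, set())
        nxt -= seen
        if target in nxt:
            return weight
        seen |= nxt
        frontier = nxt
        weight /= 2
    return 0.0

# Example graph: I follow "alice"; alice follows "bob".
follows = {"me": {"alice"}, "alice": {"bob"}}
direct = wot_score("me", "alice", follows)    # 1.0
one_hop = wot_score("me", "bob", follows)     # 0.5
stranger = wot_score("me", "mallory", follows)  # 0.0
```

Because the score starts from `me`, two users looking at the same account can see different numbers, which is exactly what distinguishes this from a platform-imposed rating.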
All platforms already have this scoring system, and Nostr does too; we're just using it at the relay and app level for spam. The point isn't to have no moderation, trust, or safety on the protocol. The point is that users should have agency. If you control your identity, can use multiple apps, and can choose which moderation judgements you want, then that's not censorship. That's user empowerment.
Nostr has no singletons… no single directory, no registry, no point of control. Users own their identity and they can choose which clients they use and those can choose which relays they pull data from. This is the freedom we need.
Users should also be able to have private accounts, choose who can see their content, and what kind of rules they want to follow. Having functional opt-in moderation options is important. A web of trust score is an important way of achieving that end.
A 4chan-like system without any ability to create personal safety will only appeal to a very small and privileged subset of internet users. It's fine for something like that to exist, but Nostr can be so much more than that.
Published at 2024-04-01 22:56:05

Event JSON
{
  "id": "4615e5b494f1e605226cd822be06ed975b87b4140cc79f1d915e49743b87c834",
  "pubkey": "19dcd48f846e6623d5264e601c41dcbe184eacdae6d6da191cc9b81a97947bcd",
  "created_at": 1712012165,
  "kind": 1,
  "tags": [
    ["q", "a4c70d2f22415b1ba7b23ad4bc805e12fe2113f63f6cf43cd8d9194bfc7e515c"],
    ["p", "76c71aae3a491f1d9eec47cba17e229cda4113a0bbb6e6ae1776d7643e29cafa"],
    ["p", "76c71aae3a491f1d9eec47cba17e229cda4113a0bbb6e6ae1776d7643e29cafa"]
  ],
  "content": "Exactly. I could even see there being a rolling calculation like “past-30-days trust rating” or whatever (would be really cool if the user could set and change those parameters in the client). There are lots of ways to increase trust without censoring and a big part of that includes accountability and transparency while putting the user experience into the hands of the user themselves. \n\nUser reviews - negative and positive - are utilized in e-commerce for the common good. Why do we not also value this kind of “credentialing” so to speak in terms of the quality of our social interactions? Just food for thought. nostr:note15nrs6tezg9d3hfaj8t2teqz7ztlzzylk8ak0g0xcmyv5hlr729wq45whyz",
  "sig": "37cd155c5e5efa3138298e25630dd46bb4013ac6c947e180583e751d1e8cc4cbb64e10e5011b886943452b6c7de31b80d3eb3064978e5bdb2fab3309fcb3ea0b"
}