
📅 Original date posted: 2015-06-28
📝 Original message:
On 28 June 2015 at 07:34, Raystonn <raystonn at hotmail.com> wrote:
> nodes are limited to 133 connections. This is 8 outgoing connections and
> 125 incoming connections. [...] Once your full node reaches 133 connections,
> it will see no further increase in load [...] Only transaction rate will affect the
> load on your node.

The total system cost is more relevant, or the total cost per user. I
think you are stuck on O( t * m ) thinking, where t = transactions and
m = nodes. The total cost per user is increasing, which is why better
scaling algorithms need to be found, and why people are working on
lightning-like systems.
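
To spell out the per-user arithmetic (a back-of-the-envelope sketch
using the same t and m, not figures from your mail): the network as a
whole performs on the order of t * m validations, so the cost per user
is on the order of t * m / n for n users. A per-node connection cap
bounds relay fan-out at each node, not this total; if t and m both grow
in proportion to n, the cost per user still grows linearly in n.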

> fear larger blocks based on an assumption of exponential growth of work, which just
> isn't the case.

People have been explaining a quadratic system-level increase, which is
not exponential; the exponential-growth assumption is simply wrong.

> Decentralisation is planned to scale down once the 133 connection limit is
> hit. Like it or not, this is the current state of the code.

No, people are not assuming decentralisation would decrease. They are
assuming the number of economically dependent full nodes would
increase, and that's where the O( n^2 ) comes from! If we assume, say,
c = 0.1% of users will run full nodes, and users make some small-world
assumed number of transactions that doesn't increase greatly as more
users are added to the network, then t grows as O( n ) and m = c * n
also grows as O( n ), so O( t * m ) => O( n^2 ).
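
To make that concrete, here is a minimal sketch (the Python is mine,
not from the thread; the 0.1% full-node fraction and the per-user
transaction count are the assumed parameters from the paragraph above):

    # Total validation work under the O( t * m ) model: every
    # transaction is validated by every full node.
    def total_validation_work(n_users, k_tx_per_user=2, c_full=0.001):
        t = n_users * k_tx_per_user   # transactions: O( n )
        m = n_users * c_full          # full nodes: O( n )
        return t * m                  # total work: O( n^2 )

    for n in (10**6, 10**7, 10**8):
        print(n, total_validation_work(n))

Each 10x in users yields roughly 100x the total validation work, i.e.
roughly 10x the work per user: the quadratic blow-up.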

Seeing decentralisation failing isn't a useful direction, as Bitcoin
depends on decentralisation for most of its useful security properties.
People running around saying "great, let's centralise Bitcoin and scale
it" are not working on Bitcoin. They may more usefully go work on
competing systems without proof of work, as that's where this line of
reasoning ends up. There are companies working on such things. Some of
them support Bitcoin IOUs. Some of them have job openings.

We can improve decentralisation, and use bandwidth and relay
improvements to get some increase in throughput. But starting down a
simplistic, ever-increasing block-size mode of thinking is destructive
and not Bitcoin. If you want to do that, you need to do it in an
offchain system. You can't build on sand, so your offchain system won't
be useful if Bitcoin doesn't have reasonable decentralisation to retain
useful meaning. Hence lightning. There are existing layer-2 things that
have on-chain netting. Go work on one of those. But people need to
understand the constraints and stop arguing to break Bitcoin to
"scale". It's too simplistic.

Even Gavin's proposal is not trying to do that, hence the reference to
Nielsen's law. His parameters are too high for too long for basic
safety or prudence, but the general idea of reclaiming some throughput
from network advances is reasonable.
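
For a sense of scale (a minimal sketch: the ~50%/year rate is Nielsen's
observed growth in high-end user bandwidth, while the 1 MB starting
point is an illustrative assumption, not Gavin's actual parameters):

    # Throughput reclaimable if a block-size limit merely tracked
    # bandwidth growth per Nielsen's law (~50%/year).
    def limit_mb(years, start_mb=1.0, annual_growth=0.50):
        return start_mb * (1 + annual_growth) ** years

    for y in (0, 5, 10):
        print(y, "years:", round(limit_mb(y), 1), "MB")
    # 0 years: 1.0 MB; 5 years: ~7.6 MB; 10 years: ~57.7 MB

The disagreement is over how high and for how long such a curve can
safely run, not over whether network capacity grows.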
Also, decentralisation is key, and that is something we can improve
with pooling protocols to phase out the artificial centralisation. We
can also educate people to use a full node they economically depend on,
to keep the full-node to SPV ratio reasonable, which is also needed for
security.

Adam