📅 Original date posted:2023-01-01
🗒️ Summary of this message: A proposal for a halving system based on average difficulty is suggested, but it does not address the immediate danger of profit margins and hashing power. A demurrage soft-fork may be a more plausible solution.
📝 Original message: Is the storage fee, averaged out over many future blocks, not a hardcoded value but one regulated by a free market?
The problem I see with demurrage is that the fee is only taken when you spend. If people keep hoarding, there is no additional income for miners.
With tail emission, even if people keep hoarding, the fee is taken immediately (as dilution of all holders) and distributed to miners.
We can still hope that global adoption lies ahead (with most countries ending up like El Salvador). That could increase the price and market cap of Bitcoin by an order of magnitude.
And that is why hoarding may persist even under demurrage: the long-term risk/reward is extremely appealing (i.e. a relatively small, delayed tax versus a huge potential profit).
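To make that concrete, here is a toy calculation (my own illustrative numbers, not from any proposal) of yearly miner income under the two schemes as hoarding increases. The point is only the sensitivity to hoarding, not the absolute figures, which depend entirely on the assumed fee and subsidy rates:

    # Toy comparison of yearly miner income: demurrage-on-spend vs tail emission.
    # All numbers are illustrative assumptions, not protocol values.

    TOTAL_SUPPLY = 21_000_000      # BTC (rounded)
    BLOCKS_PER_YEAR = 52_560       # ~ 6 blocks/hour * 24 * 365

    def demurrage_income(hoarded_fraction, yearly_fee_rate=0.001):
        """Income if the fee is only realized on coins that actually move this year."""
        moving_supply = TOTAL_SUPPLY * (1 - hoarded_fraction)
        return moving_supply * yearly_fee_rate

    def tail_emission_income(subsidy_per_block=0.1):
        """Income from a fixed per-block subsidy, collected whether people hoard or not."""
        return subsidy_per_block * BLOCKS_PER_YEAR

    for hoarded in (0.5, 0.8, 0.95):
        print(f"hoarded={hoarded:.0%}: demurrage ~{demurrage_income(hoarded):,.0f} BTC/yr, "
              f"tail emission ~{tail_emission_income():,.0f} BTC/yr")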
On 2022-12-31 00:29:08, Peter Todd <pete at petertodd.org> wrote:
> On Fri, Dec 23, 2022 at 07:43:36PM +0100, jk_14 at op.pl wrote:
>
> > Necessary or not - it doesn't hurt to plan a robust model, just in case. The proposal is:
> >
> > Every 210,000 blocks, let the code calculate the average difficulty of the last 100 retargets (100 fits well within 210,000 / 2016 = 104.166)
> > and compare it with the maximum of all such values calculated before, every 210,000 blocks:
> >
> >
> > if average_diff_of_last_100_retargets > maximum_of_all_previous_average_diffs
> >     do halving
> > else
> >     do nothing
> >
> >
> > This way:
> >
> > 1. the system cannot be gamed
> > 2. only in the case of a destructive halving does the system wait for network security to recover
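
For clarity, a minimal Python sketch of the quoted rule as I read it (my own illustrative code, not part of the proposal), assuming the average difficulty of each 210,000-block epoch is available:

    def simulate_epochs(avg_diff_per_epoch, initial_subsidy=50.0):
        """Walk through consecutive 210,000-block epochs, halving the subsidy only
        when an epoch's average difficulty exceeds every previous epoch's average."""
        subsidy = initial_subsidy
        prev_max = 0.0                   # maximum over all previous epoch averages
        for avg in avg_diff_per_epoch:
            if avg > prev_max:           # network security recovered (or still growing)
                subsidy /= 2             # do halving
            # else: do nothing, wait for recovery
            prev_max = max(prev_max, avg)
            yield subsidy

    # Difficulty grows, dips after a halving, then recovers:
    print(list(simulate_epochs([10.0, 20.0, 15.0, 25.0])))  # -> [25.0, 12.5, 12.5, 6.25]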
> First of all - while I suspect you already understand this issue - I should
> point out the following:
>
> The immediate danger we have with halvings is that in a competitive market,
> profit margins tend towards marginal costs - the cost to produce an additional
> unit of production - rather than total costs - the cost necessary to recover
> prior and future expenses. Since the halving is a sudden shock to the system,
> under the right conditions we could have a significant amount of hashing power
> just barely able to afford to hash prior to the halving, resulting in all that
> hashing power immediately having to shut down and fees increasing dramatically,
> and likely, chaotically. Your proposal does not address that problem as it can
> only measure difficulty prior to the halving point.
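
To illustrate the scale of that shock with made-up numbers (an assumption-laden toy, not a measurement): if marginal costs of existing hash power are spread between 30% and 100% of the current revenue per unit of hash, halving the revenue overnight knocks out the majority of it at once, before difficulty has any chance to adjust:

    # Toy model of a halving shock. The cost distribution is an illustrative assumption.
    import random

    random.seed(0)
    revenue_per_hash = 1.0                                   # normalized, pre-halving
    marginal_costs = [random.uniform(0.3, 1.0) for _ in range(10_000)]

    post_halving_revenue = revenue_per_hash / 2              # subsidy cut in half overnight
    profitable = sum(c <= post_halving_revenue for c in marginal_costs)
    print(f"hash power still above water right after the halving: "
          f"{profitable / len(marginal_costs):.0%}")          # expected ~29% with this range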
> Other than that problem, I agree that this proposal would, at least in theory,
> be a positive improvement on the status quo. But it is a hard fork and I don't
> think there is much hope for such hard forks to be implemented. I believe that
> a demurrage soft-fork, implemented via a storage fee averaged out over many
> future blocks, has a much more plausible route towards implementation.
> --
> https://petertodd.org 'peter'[:-1]@petertodd.org
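
For what it's worth, the message above doesn't spell out the mechanism, but one way to picture a storage fee averaged out over many future blocks (purely my own illustrative reading, with made-up names and rates) is a small per-block charge on a coin's value while it sits unspent, whose proceeds are paid out to miners gradually rather than in one lump:

    # Illustrative reading only -- not a description of any concrete soft-fork design.

    def storage_fee(value_sats: int, blocks_held: int, fee_per_block: float = 1e-7) -> int:
        """Total fee accrued by a coin of `value_sats` left unspent for `blocks_held` blocks."""
        return int(value_sats * fee_per_block * blocks_held)

    def per_block_payout(total_fee_sats: int, payout_window_blocks: int = 52_560) -> float:
        """Spread the accrued fee over many future blocks (~1 year here),
        smoothing miner income instead of paying it all at the moment of spending."""
        return total_fee_sats / payout_window_blocks

    accrued = storage_fee(100_000_000, blocks_held=210_000)   # 1 BTC left unspent ~4 years
    print(accrued, "sats accrued;", per_block_payout(accrued), "sats per future block")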