Jorge Timón [ARCHIVE] on Nostr:
📅 Original date posted: 2015-07-30
📝 Original message:
On Thu, Jul 30, 2015 at 10:53 PM, Tom Harding <tomh at thinlink.com> wrote:
> On 7/30/2015 11:14 AM, Jorge Timón wrote:
>> The blocksize limit (your "production quota") is necessary for
>> decentralization, not for having a functioning fee market.
>
>> If we can agree that hitting the limit will JUST cause higher fees and
>> not bitcoin to fail, puppies to die or the sky to turn purple I think
>> that's a great step forward in this debate.
>
> It's interesting how people see things differently. I think your first
> statement above represents a great step forward in the debate. Unlike
> Adam Back, you state that a block size limit is not necessary to create
> a functioning fee market.
Yes, Adam Back and I sometimes see things differently, and that's fine.
Many times we realize later that we're saying the same thing with
different words and we're just arguing about terminology. That's
not a problem exclusive to us, it's a universal communication
problem. That's why math (which is nothing but a language) was
invented: to avoid arguing about terminology, by forcing every
concept used to be defined beforehand.
Sorry for the distraction, but I think this is one of those times.
Whether "hitting the limit" is "necessary" (I bet he never said
"strictly necessary") or just "helpful" is not very relevant. I think
Adam and I agree that hitting the limit wouldn't be bad, but actually
good for a young and immature market like bitcoin fees.
Apart from the dubious time-preference premium (dubious because in
most cases it's just wallets' defaults and not users in a hurry),
transactions are basically free if you are willing to wait (and
apparently not even that much).
If I were a miner and you wanted me to include your transaction for
free, you would be asking me to give you money, which I would prefer
to do directly (if you were a friend, or a non-profit organization
that I like, or whatever) rather than through bitcoin fee discounts.
By including your transaction, I'm increasing the probability of my
mined block being orphaned, and you're not willing to give me even a
single satoshi in exchange.
Today, in practice, a one-satoshi fee and no fee at all are treated
exactly the same by most (maybe all?) miners, which, if you ask me, I
find very ~~unfair~~ economically absurd.
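To put a rough number on that intuition (every figure below is invented
for illustration; this is just my back-of-the-envelope sketch, not data
from any miner):

# Back-of-the-envelope sketch (all numbers invented): the marginal cost
# to a miner of including one more transaction is roughly
#   extra_orphan_probability * value_of_the_block.
block_reward_btc = 25.0         # subsidy at the time of this thread
extra_orphan_prob = 1e-6        # made-up marginal orphan risk per extra tx
marginal_cost_btc = extra_orphan_prob * block_reward_btc
print(marginal_cost_btc * 1e8)  # ~2500 satoshis, far more than the 1 satoshi offered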
Are all miners just stupid?
Not necessarily; they just don't care about fees or transactions at all.
Who is to blame? Certainly not the value chosen for the block size
limit; it's clearly the subsidy's fault: the subsidy is all miners care
about (by the way, that's also the illness behind the SPV-mining
symptom). I am very worried about excessive mining subsidies (if you
knew how worried the freicoin community was [and still is] about this
problem, even though freicoin probably has one of the lowest mining
subsidies out there [currently, and in perpetuity, an annual 5% of the
monetary base]...).
And I think that "hitting the limit" is not a catastrophe at all, but
rather an opportunity to motivate miners to start caring more about
transactions and fees (in proportion to how much they care about them).
And if the limit is increased later and fees fall again, that's fine,
because miners will already be more prepared for the next time we
"hit the limit".
Anyway, maybe that hope is irrelevant, but what I'm convinced about is
that raising non-fast-confirmation mining fees above zero is not a bad
thing.
> As to your second statement, unfortunately for immediate harmonious
> relations, I was merely separating out the elevated fee market concern,
> not at all saying it is the only or even the biggest concern with
> limited capacity. Alan Reiner, Ryan X. Charles and others have
> eloquently explained how restrictive a 1MB limit is, even with "layer 2".
To be honest, I've only followed the ones that were assuming the worst
case for optimization: bitcoin as a global monetary monopoly.
If I remember correctly, they were aiming for something around 170 MB,
but in any case, any value for the constant is completely arbitrary to
me at this point, including 1 MB. I'm deeply offended when I feel
included in the "1MBers group" because I don't feel like that at all.
To be honest, I have no idea what the correct value should be; all I
know is that it's a trade-off in a monotonic function:
f(blocksize) = decentralization
> What's missing from the decentralization dialog is a quantitative
> measure of decentralization.
You are completely right: there's no defined measurable unit for
"decentralization" ("p2pness", whatever bitcoin has that wasn't
possible before pow-based distributed consensus).
And I'm afraid we will never have such a measurable unit. Maybe the
best definition of the property we're trying to capture is just "the
opposite of centralization", assuming centralization is easier to
define.
The best we have now are pool percentages, number of nodes, and the
subsidy/fee ratio (as said, this influences things like SPV mining).
How does all of that combine into...?
g(many unrelated metrics) = centralization
I don't really think anybody knows, but no matter what your
interpretation of some Japanese-named dude on the internet's words
(aka bitcoin sacred history) is, I doubt you think 3 validating nodes
is enough for a "p2p" monetary network.
It is very possible that decentralization(blocksize) =
decentralization(blocksize+1) for many values of blocksize, but I
think the burden of proof that decentralization(current_blocksize)
~= decentralization(current_blocksize+1) is on those who propose
++current_blocksize.
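Just to make the shape of that claim concrete (the function below is
entirely made up; nobody has measured anything like it):

# Toy sketch only: decentralization as a monotonically non-increasing
# function of the block size limit. The curve is invented; the point is
# that f(x) == f(x+1) may hold for many x, but whoever proposes x+1
# carries the burden of showing it holds at the current x.
def decentralization(blocksize_mb):
    if blocksize_mb <= 2:
        return 1.0                    # made-up flat region
    return 1.0 / (blocksize_mb - 1)   # made-up decay afterwards

current_blocksize = 1
print(decentralization(current_blocksize),
      decentralization(current_blocksize + 1))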
But I think ANY metric for centralization would be welcomed right now.
In fact, it doesn't need to be a function of blocksize; it could be a
function of maxBlockSigops or maybe even maxBlockInputs or
maxBlockOutputs.
But if we don't want to have any consensus limit on centralization,
bitcoin has already failed (and doesn't need expensive proof of work).
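For what it's worth, here is one toy candidate (my own illustration,
not something proposed in this thread): collapsing the pool
percentages mentioned above into a single Herfindahl-style
concentration number, with invented shares:

# Illustration only: a Herfindahl-style concentration index over mining
# pool hashrate shares. It equals 1/N for N equal pools and 1.0 when a
# single pool has everything; the shares below are invented.
def pool_concentration(shares):
    total = sum(shares)
    return sum((s / total) ** 2 for s in shares)

made_up_shares = [0.25, 0.20, 0.15, 0.15, 0.10, 0.10, 0.05]
print(pool_concentration(made_up_shares))  # higher = more centralized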
> Why not slam users with higher fees now, if we accept that they may be
> necessary someday? For the same reasons you don't ask a child, age 5, to
> work in a factory.
It is a certainty that fees will be necessary someday: bitcoin's
seigniorage is limited to 21 M to subsidize mining, and we know that
won't last forever. Expensive proof of work (that centralized systems
lack) must be paid for somehow.
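To be concrete about why the subsidy won't last forever: the issuance
is a halving geometric series, so total seigniorage is capped at just
under 21 M, and the per-block subsidy itself eventually reaches zero:

# The subsidy schedule: 50 BTC per block, halved every 210,000 blocks,
# in integer satoshis. The series sums to just under 21,000,000 BTC,
# after which miners can only be paid through fees.
SATOSHIS_PER_BTC = 100_000_000
subsidy = 50 * SATOSHIS_PER_BTC
total = 0
while subsidy > 0:
    total += subsidy * 210_000
    subsidy //= 2  # integer halving eventually hits zero

print(total / SATOSHIS_PER_BTC)  # ~20,999,999.9769 BTC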
Whose child am I asking to work in a factory? I feel I'm missing
something there.