My claim was that Pieter's proposal undercuts matching blocksize growth to technological progress, not that it fails to limit centralization pressure.  The two are somewhat related, but I want to be clear about what I originally stated.  I would also point out that Pieter's proposal lacks such technical criteria as well.

That being said, I think basing any growth rate on theoretical centralization pressure is not sensible, and Pieter's proposal rightly excludes it, aiming instead for a very gradual increase over time intended to keep blocksize growth consistent with technological bandwidth growth.  The problem is that it ignores the last 6 years (we are already "behind") and underestimates bandwidth and storage growth.  (See Nielsen's Law, which posits 50% annual growth in high-end user bandwidth and has held for 30 years.)
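
As a rough illustration of that gap (the ~17.7%/year schedule below is my assumed stand-in for a conservative growth proposal, not a quoted figure), compounding the two rates shows how quickly they diverge:

```python
# Sketch: compound growth of available bandwidth under Nielsen's Law
# (~50%/year) vs. an assumed conservative blocksize schedule
# (~17.7%/year, an illustrative stand-in, not a quoted figure).

NIELSEN = 1.50      # bandwidth multiplier per year (Nielsen's Law)
SCHEDULE = 1.177    # assumed blocksize multiplier per year

bandwidth, blocksize = 1.0, 1.0
for year in range(1, 11):
    bandwidth *= NIELSEN
    blocksize *= SCHEDULE
    print(f"year {year:2d}: bandwidth x{bandwidth:6.1f}, blocksize x{blocksize:5.2f}")
```

After ten years the bandwidth curve is roughly an order of magnitude ahead of the schedule, which is the sense in which a too-conservative limit falls further "behind" every year.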

Pieter seems to be of the belief that since Bitcoin will never be able to handle all the world's transactions, we should instead "...decrease the need for trust required in off-chain systems...".  This is akin to basing all the world's transactions on a small settlement layer; like a pyramid balanced upside down, it will topple.

I'm of the belief that the "reasonable node" test is a simple enough test to maintain decentralization.

A Raspberry Pi 2 on a reasonable Internet connection with a reasonable hard drive can easily run a node with 8 or even 20 MB blocks.
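
A back-of-envelope check of that claim (these numbers are mine, not from the thread; they assume every block is full and ignore relay overhead):

```python
# Storage and average download load on a full node, assuming every
# block is full. 144 blocks/day (one per ~10 minutes on average).

BLOCKS_PER_DAY = 144

for block_mb in (1, 8, 20):
    gb_per_day = block_mb * BLOCKS_PER_DAY / 1000
    tb_per_year = gb_per_day * 365 / 1000
    kbit_s = block_mb * 1000 * 8 / 600   # avg rate to keep up, kbit/s
    print(f"{block_mb:2d} MB blocks: {gb_per_day:5.2f} GB/day, "
          f"{tb_per_year:4.2f} TB/yr, ~{kbit_s:6.1f} kbit/s average")
```

Even the 20 MB worst case works out to about 1 TB/year of storage and well under 1 Mbit/s of average bandwidth, comfortably within commodity hardware, which is the substance of the "reasonable node" test.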

As Pieter's proposal indicates, "If over time, this growth factor is beyond what the actual technology offers, the intention should be to soft fork a tighter limit."  I wholeheartedly agree, which is why we should plan to be ahead of the curve...not behind it.

On Aug 7, 2015 2:15 PM, "Mark Friedenbach" <mark@friedenbach.org> wrote:
Surely you have some sort of empirical measurement demonstrating the validity of that statement? That is to say you've established some technical criteria by which to determine how much centralization pressure is too much, and shown that Pieter's proposal undercuts expected progress in that area?

On Fri, Aug 7, 2015 at 12:07 PM, Ryan Butler <rryananizer@gmail.com> wrote:

Clarification...

These are not mutually exclusive.  We can design a blocksize increase that expands available space on chain AND follows technological evolution.  Pieter's latest proposal is way too conservative on that front.

And given Pieter's assertion that demand is infinite, there will still be an ocean of off-chain transactions for the likes of Blockstream to address.

On Aug 7, 2015 1:57 PM, "Ryan Butler" <rryananizer@gmail.com> wrote:

Who said anything about scaling Bitcoin to Visa levels now?  We're talking about an increase now that scales into the future at a rate consistent with technological progress.

Pieter himself said "So, I think the block size should follow technological evolution...".

The blocksize increase proposals have been modeled around this very thing.  It's reasonable to increase the blocksize to a point where a reasonable person, with reasonable equipment and Internet access, can run a node or even a miner with acceptable orphan rates.  Most miners are SPV mining anyway.  The 8 or even 20 MB limits are within those parameters.
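
For concreteness, orphan risk can be approximated with a standard simplification (my model, not something from any of the proposals): block discovery is roughly Poisson with a 600-second mean, so a block that takes t seconds to propagate is orphaned with probability about 1 - e^(-t/600).  A naive sketch, assuming a 10 Mbit/s uplink and whole-block relay:

```python
import math

# Naive orphan-rate model: a block that takes t seconds to reach the
# rest of the hashpower is orphaned with probability ~ 1 - exp(-t/600),
# treating block discovery as a Poisson process with a 600 s mean.

def orphan_rate(block_mb: float, uplink_mbps: float) -> float:
    t = block_mb * 8 / uplink_mbps   # seconds to transmit the full block
    return 1 - math.exp(-t / 600)

for size in (1, 8, 20):
    print(f"{size:2d} MB at 10 Mbit/s: ~{orphan_rate(size, 10):.2%} orphaned")
```

By this crude measure even 20 MB blocks stay under a ~3% orphan rate on a modest uplink, and SPV mining and more efficient relay protocols push the effective number lower still.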

These are not mutually exclusive.  We can design a blocksize increase that addresses demand exceeding the available space AND follows technological evolution.  Pieter's latest proposal is way too conservative on that front.

On Aug 7, 2015 1:25 PM, "Mark Friedenbach" <mark@friedenbach.org> wrote:
Please don't put words into Pieter's mouth. I guarantee you everyone working on Bitcoin in their heart of hearts would prefer that everyone in the world be able to use the Bitcoin ledger for whatever purpose, if there were no cost.

But like any real-world engineering issue, this is a matter of tradeoffs. At the extreme it is simply impossible to scale Bitcoin to the terabyte-sized blocks that would be necessary to service the entire world's financial transactions. Not without entirely sacrificing the protection of policy neutrality achieved through decentralization. And as that is Bitcoin's only advantage over traditional consensus systems, you would have to wonder what the point of such an endeavor would be.
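
The arithmetic behind the terabyte figure is straightforward (the inputs below are order-of-magnitude assumptions for illustration, not measured values):

```python
# Order-of-magnitude check on the "terabyte blocks" claim. Both the
# global transaction rates and the average transaction size below are
# assumptions for illustration only.

AVG_TX_BYTES = 500        # assumed average transaction size
BLOCK_INTERVAL_S = 600    # target seconds per block

for world_tps in (100_000, 1_000_000, 10_000_000):
    block_tb = world_tps * AVG_TX_BYTES * BLOCK_INTERVAL_S / 1e12
    print(f"{world_tps:>10,} tx/s -> ~{block_tb:5.2f} TB per block")
```

At plausible estimates of global transaction volume, the required block size lands in the terabyte range.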

So *somewhere* you have to draw the line, and transactions below that level are simply pushed into higher-level or off-chain protocols.

The issue, as Pieter and Jorge have been pointing out, is that technical discussion over where that line should be has been missing from this debate.

On Fri, Aug 7, 2015 at 10:47 AM, Ryan Butler via bitcoin-dev <bitcoin-dev@lists.linuxfoundation.org> wrote:

Interesting position there, Pieter...you fear more people actually using Bitcoin.  The fewer on-chain transactions, the lower the velocity and the lower the value of the network.  I would be careful what you ask for, because you may end up with nothing left to root the security of these off-chain transactions in, and then neither will exist.

Nobody ever said you wouldn't run out of capacity at any size.  It's quite the fallacy to conclude from that statement that the block size should remain far below a capacity the network can easily sustain, one which would bring more users/velocity/value to the system.  The outcomes of those two scenarios are asymmetric: a higher block size can support more users and volume.

Raising the blocksize isn't about fear.  It's the realization that we are at a point where we can raise it to support more users and transactions while keeping the downsides (centralization, etc.) to a minimum.

On Aug 7, 2015 11:28 AM, "Pieter Wuille via bitcoin-dev" <bitcoin-dev@lists.linuxfoundation.org> wrote:
On Fri, Aug 7, 2015 at 5:55 PM, Gavin Andresen <gavinandresen@gmail.com> wrote:
On Fri, Aug 7, 2015 at 11:16 AM, Pieter Wuille <pieter.wuille@gmail.com> wrote:
I guess my question (and perhaps that's what Jorge is after): do you feel that blocks should be increased in response to (or for fear of) such a scenario?

I think there are multiple reasons to raise the maximum block size, and yes, fear of Bad Things Happening as we run up against the 1MB limit is one of the reasons.

I take the opinion of smart engineers who actually do resource planning and have seen what happens when networks run out of capacity very seriously.

This is a fundamental disagreement then. I believe that the demand is infinite if you don't set a fee minimum (and I don't think we should), and it just takes time for the market to find a way to fill whatever is available - the rest goes into off-chain systems anyway. You will run out of capacity at any size, and acting out of fear of that reality does not improve the system. Whatever size blocks are actually produced, I believe the result will either be something people consider too small to be competitive ("you mean Bitcoin can only do 24 transactions per second?" sounds almost the same as "you mean Bitcoin can only do 3 transactions per second?"), or something that is very centralized in practice, and likely both.
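
For reference, the quoted throughput figures follow from simple arithmetic; the average transaction size below is an assumption chosen to match them:

```python
# Arithmetic behind the "3 vs 24 transactions per second" comparison.
# The average transaction size is an assumed value, picked so the
# output matches the figures quoted above.

AVG_TX_BYTES = 550        # assumed average transaction size
BLOCK_INTERVAL_S = 600    # target seconds per block

for block_mb in (1, 8):
    tps = block_mb * 1_000_000 / AVG_TX_BYTES / BLOCK_INTERVAL_S
    print(f"{block_mb} MB blocks: ~{tps:.0f} tx/s")
```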


And if so, if that is a reason for an increase now, won't it be a reason for an increase later as well? It is my impression that your answer is yes, that this is why you want to increase the block size quickly and significantly, but correct me if I'm wrong.

Sure, it might be a reason for an increase later. Here's my message to in-the-future Bitcoin engineers: you should consider raising the maximum block size if needed and you think the benefits of doing so (like increased adoption or lower transaction fees or increased reliability) outweigh the costs (like higher operating costs for full nodes or the disruption caused by ANY consensus rule change).

In general that sounds reasonable, but it's a dangerous precedent to make technical decisions based on a fear of economic change...

--
Pieter


_______________________________________________
bitcoin-dev mailing list
bitcoin-dev@lists.linuxfoundation.org
https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev

