Gavin,
They are not analogous.

Increasing performance and making other changes that help enable scaling can be done at small scale or large scale alike.
Dealing with full blocks and the resultant feedback effects is something that can only be done when blocks are full.  It's too complicated a problem to solve without seeing the effects first hand, and unlike the block size/scaling concerns, it's binary: you're either in a situation where demand outgrows supply or you aren't.

Fee estimation is one example: I tried very hard to make fee estimation work well when blocks started filling up, but it was impossible to truly test, and in the small sample of full blocks we've seen since the code went live, many improvements have made themselves obvious.  Growing mempools are another issue that doesn't exist at all while supply > demand, and they also turn out to be a difficult problem to solve.
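To make the "mempool growth" point concrete, here is a toy sketch (my own illustration, not Bitcoin Core's actual implementation or data structures) of feerate-based eviction. Note that the eviction path is only ever exercised once the backlog exceeds the size budget, i.e. once demand has outgrown supply; while supply > demand the code below is effectively dead, which is why the problem couldn't be observed before blocks filled.

```python
class ToyMempool:
    """Hypothetical size-limited mempool that evicts lowest-feerate txs."""

    def __init__(self, max_bytes):
        self.max_bytes = max_bytes
        self.used = 0
        self.txs = {}  # txid -> (feerate, size)

    def add(self, txid, fee, size):
        self.txs[txid] = (fee / size, size)
        self.used += size
        self._trim()

    def _trim(self):
        # Only runs when the pool is over budget: evict the
        # lowest-feerate transaction until we fit again.
        while self.used > self.max_bytes:
            victim = min(self.txs, key=lambda t: self.txs[t][0])
            self.used -= self.txs[victim][1]
            del self.txs[victim]


m = ToyMempool(max_bytes=500)
m.add("a", fee=100, size=250)  # feerate 0.4
m.add("b", fee=50, size=250)   # feerate 0.2; pool now exactly full
m.add("c", fee=200, size=250)  # feerate 0.8; over budget, "b" is evicted
```

Even this toy version hints at the real difficulty: choosing eviction and re-admission policy under sustained overload has incentive and DoS implications that only show up under real fee pressure.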

Nevertheless, I mostly agree that these arguments shouldn't be the reason not to expand the block size; I think they are more an example of how immature all of this technology is, and of why we should be concentrating on improving it before trying to scale it to world-acceptance levels.  The saddest thing about this whole debate is how fundamental improvements to the science of cryptocurrencies (things like segregated witness and confidential transactions) are getting lost in the circus around trying to cram a few more users into the existing system sooner rather than later.



On Mon, Aug 10, 2015 at 10:12 AM, Gavin Andresen via bitcoin-dev <bitcoin-dev@lists.linuxfoundation.org> wrote:
On Fri, Aug 7, 2015 at 1:33 PM, Jorge Timón <jtimon@jtimon.cc> wrote:


On Aug 7, 2015 5:55 PM, "Gavin Andresen" <gavinandresen@gmail.com> wrote:
>
> I think there are multiple reasons to raise the maximum block size, and yes, fear of Bad Things Happening as we run up against the 1MB limit is one of the reasons.

What are the other reasons?

> I take the opinion of smart engineers who actually do resource planning and have seen what happens when networks run out of capacity very seriously.

When "the network runs out of capacity" (when we hit the limit) do we expect anything to happen apart from minimum market fees rising (above zero)?
Obviously any consequences of fees rising are included in this concern.

It is frustrating to answer questions that we answered months ago, especially when I linked to these in response to your recent "increase advocates say that not increasing the max block size will KILL BITCOIN" false claim:

Executive summary: when networks get over-saturated, they become unreliable.  Unreliable is bad.

Unreliable and expensive is extra bad, and that's where we're headed without an increase to the max block size.

RE: the recent thread about "better deal with that type of thing now rather than later" :  exactly the same argument can be made about changes needed to support a larger block size-- "better to do that now than to do that later."  I don't think either of those arguments are very convincing.


--
Gavin Andresen


_______________________________________________
bitcoin-dev mailing list
bitcoin-dev@lists.linuxfoundation.org
https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev