Can you please elaborate on what terrible things will happen if we
don't increase the block size by winter this year?

I was referring to winter next year. 0.12 isn't scheduled until the end of the year, according to Wladimir. I explained where this figure comes from in this article:

https://medium.com/@octskyward/bitcoin-s-seasonal-affective-disorder-35733bab760d

It's a fairly simple estimate based on previous growth patterns.
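
For illustration only, here is a minimal sketch of the kind of extrapolation involved, in Python. The average block size and growth rate below are placeholders made up for the example, not the figures from the article; it simply solves avg * growth^t = limit for t, assuming roughly exponential traffic growth.

    import math

    # Placeholder inputs (made up for illustration; not the article's figures).
    avg_block_size_mb = 0.45   # assumed current average block size
    yearly_growth = 2.0        # assumed traffic roughly doubles each year
    limit_mb = 1.0             # current consensus block size limit

    # Solve avg * growth^t = limit for t: years until average blocks hit the limit.
    years_to_full = math.log(limit_mb / avg_block_size_mb) / math.log(yearly_growth)
    print("Roughly %.1f years until average blocks reach the limit" % years_to_full)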

Because I love wild guesses and mine is that full 1 MB blocks will not
happen until June 2017.

OK, it could be. But do you think this debate will play out significantly differently if you are right, I am wrong, and we have this discussion next summer instead? Because in several years of watching these debates, I haven't seen much change in them.
 
We've successfully reached consensus for several softfork proposals already.

Are you sure about that?

What if Gavin popped up right now and said he disagreed with every current proposal, that he disagreed with side chains too, and that there would be no consensus on any of them until the block size limit was raised?

Would you say, oh, OK, guess that's it then. There's no consensus so might as well scrap all those proposals, as they'll never happen anyway. Bye bye side chains whitepaper.

 
I just hope that by "What we need to see right now is leadership" you
don't mean something like "when Gavin and Mike agree it's enough to
deploy a hardfork" when you go from vague to concrete.

No. What I meant is that someone (theoretically Wladimir) needs to make a clear decision. If that decision is "Bitcoin Core will wait and watch the fireworks when blocks get full", that would still be showing leadership ... albeit, I believe, in the wrong direction. It would, however, let people know what's what and let them start to make longer term plans.

This dilly-dallying around is an issue - people just make vague points that can't really be disagreed with (more nodes would be nice, smaller pools would also be nice, etc.), and nothing gets done.
 
"no bitcoin long term it's broken long term but that's far away in the
future so let's just worry about the present".

I never said Bitcoin is broken in the long term. Far from it - years ago I laid out my ideas for what will happen when the block subsidy dwindles.

But yes, it's hard for me to care very much about what happens 30 years from now, for the same reason you probably care more about what happens tomorrow than about what happens after you are dead. The further into the future you try to plan, the less likely your plans are to survive unscathed.
 
What you want to avoid at all cost (the block size actually being
used), I see as the best opportunity we have to look into the future.

I think I see one of the causes of disagreement now.

I will write more soon, maybe this evening, on the topic of what will happen if we hit the block size limit. I have some other tasks to do first.

Regardless, I don't believe we will get any useful data out of such an event. I've seen distributed systems run out of capacity before. What will happen instead is technological failure, followed by rapid user abandonment that pushes traffic back below the pressure threshold ... and those users will most likely not come back any time soon.
 
Ok, this is my plan: we wait 12 months, hope that your estimates are
correct (or, if my guess turns out to be better than yours, we keep
waiting until June 2017), and we start having full blocks, with people
sometimes having to wait 2 blocks for their transactions to be confirmed.

I disagree that'd be the outcome, but good, this is progress. Now we need to hear something like that from Wladimir, or whoever has the final say around here.
 
With respect to the fee market: I think it's fairer to say Gavin wants a market to exist, and he also wants supply to be plentiful. A 20MB limit doesn't mean every block will be 20MB the day after, any more than they're all 1MB today. Miners may discover that if they go beyond 5MB they get too many orphans, and then propagation speed will have to be optimised to break through the next bottleneck. Scaling is always about finding the next bottleneck and removing it, ideally before you hit it.
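
To make that orphan trade-off concrete, here is a rough sketch, assuming block arrivals are Poisson with a 600-second mean interval and that propagation delay grows linearly with block size. The seconds-per-megabyte figure is a made-up placeholder, not a measurement; the point is only that orphan risk rises with relay time, which is what would push miners to self-limit block size until propagation improves.

    import math

    BLOCK_INTERVAL_S = 600.0  # average time between blocks

    def orphan_probability(block_size_mb, secs_per_mb=15.0):
        # Rough orphan-risk estimate: a block risks being orphaned if a
        # competitor is found before it finishes propagating, so with
        # Poisson block arrivals P ~= 1 - exp(-delay / 600). The per-MB
        # propagation time is an illustrative assumption, not a measured one.
        delay = block_size_mb * secs_per_mb
        return 1.0 - math.exp(-delay / BLOCK_INTERVAL_S)

    for size_mb in (1, 5, 10, 20):
        print("%2d MB block -> ~%.1f%% orphan risk"
              % (size_mb, 100 * orphan_probability(size_mb)))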