public inbox for bitcoindev@googlegroups.com
* [Bitcoin-development] Proposal: A measured response to save Bitcoin Core
@ 2015-05-31  0:29 Matt Whitlock
  2015-05-31  9:32 ` s7r
  2015-05-31  9:35 ` Btc Drak
  0 siblings, 2 replies; 4+ messages in thread
From: Matt Whitlock @ 2015-05-31  0:29 UTC (permalink / raw)
  To: Gregory Maxwell, Pieter Wuille, Jeff Garzik, Wladimir J. van der Laan
  Cc: bitcoin-development

Greg, Pieter, Jeff, and Wladimir,

I'll try to be brief to respect your time.

1. I don't want to see Bitcoin die.

2. As has been discussed on this list and elsewhere: Bitcoin could potentially die due to economic and/or game-theoretic complications arising from raising the block size limit, but Bitcoin could also die due to usability complications arising from NOT raising the block size limit. Strong, personally held opinions by various members of this community notwithstanding, it is not clear which of these scenarios is more likely.

3. What *is* clear at this point is that Gavin will move ahead with his proposal, regardless of whether the remainder of the Bitcoin Core committers agree with him. If he has to commit his changes to Bitcoin XT and then rally the miners to switch, then that's what he'll do. He believes that he is working in the best interests of Bitcoin (as I would hope we all do), and so I do not fault him for his intentions. However, I think his proposal is too risky.

4. I also think that ignoring the immediate problem is too risky. If allowing significantly larger blocks will cause a serious problem for Bitcoin (which is a possibility that we cannot rule out, as we lack omniscience), then NOT making any change to Bitcoin Core will virtually *assure* that we cause exactly this problem, as the popular (non-technical) consensus appears to be in favor of Bitcoin XT and a larger block size limit. If we do nothing, then there's a very real chance that Bitcoin XT takes over, for better or worse.

5. I'd like to propose a way that we can have our cake and eat it too. My proposal attempts to satisfy both those who want larger blocks AND those who want to be extremely cautious about changing the fundamental economic parameters of Bitcoin.

6. Something I've never understood about Gavin's (et al.) proposal is why there is a massive step right up front. Assuming we accept his argument that we're critically close to running out of capacity, I still must ask: why do we need a 20x increase all at once?

7. It's not a given that blocks will immediately expand to meet the hard limit. In fact, there are strong and compelling arguments why this will NOT happen. But in any software system, if a given scenario is *possible*, then one MUST assume that it will happen and must have a plan to handle it.

8. My primary objection is not to raising the block size limit; my objection is to raising it *suddenly*. You can argue that, because we'll have plenty of time before March 2016, it's not "sudden," but, whether we do it now or a year from now or a decade from now, a step function is, by definition, sudden.

9. My proposal is that we raise the block size limit *gradually*, using an approximately smooth function, without a step discontinuity. We can employ a linear growth function to adjust the block size limit *smoothly* from 1 MB to 20 MB over the course of several years, beginning next March.

10. This is the difference between cannonballing into the deep end of the pool and walking gingerly down the steps into the shallow end. Both get you to the eventual goal, but one is reckless while the other is measured and deliberate. If there's a problem that larger blocks will enable, then I'd prefer to see the problem crop up gradually rather than all at once. If it's gradual, then we'll have time to discuss and fix it without panicking.

11. I am offering to implement this proposal and submit a pull request to Bitcoin Core. However, if another dev who is more familiar with the internals would like to step forward, then that would be superior.
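
To make point 9 concrete, here is a minimal sketch, in C++ (the language of Bitcoin Core), of what a height-based linear limit could look like. The activation height, ramp length, and names used here are placeholder assumptions for illustration, not values from any actual patch or proposal:

    // Illustrative sketch only: a linear ramp of the block size limit from
    // 1 MB to 20 MB by block height. The start height and ramp length are
    // assumed placeholders.
    #include <cstdint>

    static const uint64_t RAMP_START_SIZE    = 1000000;   // 1 MB
    static const uint64_t RAMP_END_SIZE      = 20000000;  // 20 MB
    static const int RAMP_START_HEIGHT       = 400000;    // assumed activation height (~March 2016)
    static const int RAMP_LENGTH_BLOCKS      = 5 * 52560; // assumed ~5 years at 144 blocks/day

    uint64_t MaxBlockSizeAtHeight(int nHeight)
    {
        if (nHeight <= RAMP_START_HEIGHT)
            return RAMP_START_SIZE;
        if (nHeight >= RAMP_START_HEIGHT + RAMP_LENGTH_BLOCKS)
            return RAMP_END_SIZE;
        // Linear interpolation between the endpoints, keyed to block height.
        uint64_t elapsed = uint64_t(nHeight - RAMP_START_HEIGHT);
        return RAMP_START_SIZE +
               (RAMP_END_SIZE - RAMP_START_SIZE) * elapsed / RAMP_LENGTH_BLOCKS;
    }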

Respectfully submitted,
Matt Whitlock



^ permalink raw reply	[flat|nested] 4+ messages in thread

* Re: [Bitcoin-development] Proposal: A measured response to save Bitcoin Core
  2015-05-31  0:29 [Bitcoin-development] Proposal: A measured response to save Bitcoin Core Matt Whitlock
@ 2015-05-31  9:32 ` s7r
  2015-05-31  9:35 ` Btc Drak
  1 sibling, 0 replies; 4+ messages in thread
From: s7r @ 2015-05-31  9:32 UTC (permalink / raw)
  To: Matt Whitlock, Gregory Maxwell, Pieter Wuille, Jeff Garzik,
	Wladimir J. van der Laan, Peter Todd
  Cc: bitcoin-development

Hi,

For those reading the list who are not crypto engineering experts but
are highly interested in Bitcoin and work with it on a daily basis,
what would be an easy-to-understand explanation of how this solution
represents a good fix?

So, we currently have a hard cap of 1 MB per block. This is no longer
enough because more and more people use Bitcoin and the transaction
volume has increased (yay, good news). So, rather than fixing the
issue for good, we just increase the block size hard cap to 20 MB. I
will not discuss whether this causes problems or not. But what is the
plan for when the 20 MB hard cap is reached? Increase it again? This
doesn't sound like a fix; it sounds more like kicking the can down the
road. Obviously, if 1 MB is not enough now, we have reasonable grounds
to suspect that 20 MB may not be enough in a few years.
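
For rough scale, a simple back-of-envelope calculation (assuming an
average transaction of about 250 bytes and one block every ten
minutes): a 1 MB limit allows roughly 4,000 transactions per block, or
about 7 per second, while a 20 MB limit allows roughly 80,000 per
block, or about 130 per second.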

What is the argument that 20 MB blocks will be sufficient for the long
term? Is it that 'other solutions will probably appear, such as
micropayment channels and off-chain transactions'? If that is the
case, those can easily function with 1 MB blocks as well, and we
should see them in action sooner rather than later.

I run multiple full nodes, including one running Bitcoin XT, and I
don't want to see Bitcoin XT and Bitcoin Core diverge onto different
consensus rules, creating two altcoins instead of one Bitcoin.

On 5/31/2015 3:29 AM, Matt Whitlock wrote:
> Greg, Pieter, Jeff, and Wladimir,
> 
> I'll try to be brief to respect your time.
> 
> 1. I don't want to see Bitcoin die.
> 
> 2. As has been discussed on this list and elsewhere: Bitcoin could potentially die due to economic and/or game-theoretic complications arising from raising the block size limit, but Bitcoin could also die due to usability complications arising from NOT raising the block size limit. Strong, personally held opinions by various members of this community notwithstanding, it is not clear which of these scenarios is more likely.
> 
> 3. What *is* clear at this point is that Gavin will move ahead with his proposal, regardless of whether the remainder of the Bitcoin Core committers agree with him. If he has to commit his changes to Bitcoin XT and then rally the miners to switch, then that's what he'll do. He believes that he is working in the best interests of Bitcoin (as I would hope we all do), and so I do not fault him for his intentions. However, I think his proposal is too risky.
> 
> 4. I also think that ignoring the immediate problem is too risky. If allowing significantly larger blocks will cause a serious problem for Bitcoin (which is a possibility that we cannot rule out, as we lack omniscience), then NOT making any change to Bitcoin Core will virtually *assure* that we cause exactly this problem, as the popular (non-technical) consensus appears to be in favor of Bitcoin XT and a larger block size limit. If we do nothing, then there's a very real chance that Bitcoin XT takes over, for better or worse.
> 
> 5. I'd like to propose a way that we can have our cake and eat it too. My proposal attempts to satisfy both those who want larger blocks AND those who want to be extremely cautious about changing the fundamental economic parameters of Bitcoin.
> 
> 6. Something I've never understood about Gavin's (et al.) proposal is why there is a massive step right up front. Assuming we accept his argument that we're critically close to running out of capacity, I still must ask: why do we need a 20x increase all at once?
> 
> 7. It's not a given that blocks will immediately expand to meet the hard limit. In fact, there are strong and compelling arguments why this will NOT happen. But in any software system, if a given scenario is *possible*, then one MUST assume that it will happen and must have a plan to handle it.
> 
> 8. My primary objection is not to raising the block size limit; my objection is to raising it *suddenly*. You can argue that, because we'll have plenty of time before March 2016, it's not "sudden," but, whether we do it now or a year from now or a decade from now, a step function is, by definition, sudden.
> 
> 9. My proposal is that we raise the block size limit *gradually*, using an approximately smooth function, without a step discontinuity. We can employ a linear growth function to adjust the block size limit *smoothly* from 1 MB to 20 MB over the course of several years, beginning next March.
> 
> 10. This is the difference between cannonballing into the deep end of the pool and walking gingerly down the steps into the shallow end. Both get you to the eventual goal, but one is reckless while the other is measured and deliberate. If there's a problem that larger blocks will enable, then I'd prefer to see the problem crop up gradually rather than all at once. If it's gradual, then we'll have time to discuss and fix it without panicking.
> 
> 11. I am offering to implement this proposal and submit a pull request to Bitcoin Core. However, if another dev who is more familiar with the internals would like to step forward, then that would be superior.
> 
> Respectfully submitted,
> Matt Whitlock
> 
> ------------------------------------------------------------------------------
> _______________________________________________
> Bitcoin-development mailing list
> Bitcoin-development@lists•sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/bitcoin-development
> 



^ permalink raw reply	[flat|nested] 4+ messages in thread

* Re: [Bitcoin-development] Proposal: A measured response to save Bitcoin Core
  2015-05-31  0:29 [Bitcoin-development] Proposal: A measured response to save Bitcoin Core Matt Whitlock
  2015-05-31  9:32 ` s7r
@ 2015-05-31  9:35 ` Btc Drak
  2015-05-31 10:01   ` Eric Lombrozo
  1 sibling, 1 reply; 4+ messages in thread
From: Btc Drak @ 2015-05-31  9:35 UTC (permalink / raw)
  To: Matt Whitlock; +Cc: Bitcoin Dev

[-- Attachment #1: Type: text/plain, Size: 4359 bytes --]

On Sun, May 31, 2015 at 1:29 AM, Matt Whitlock <bip@mattwhitlock•name>
wrote:

> 3. What *is* clear at this point is that Gavin will move ahead with his
> proposal, regardless of whether the remainder of the Bitcoin Core
> committers agree with him. If he has to commit his changes to Bitcoin XT
> and then rally the miners to switch, then that's what he'll do. He believes
> that he is working in the best interests of Bitcoin (as I would hope we all
> do), and so I do not fault him for his intentions. However, I think his
> proposal is too risky.
>

I seriously doubt that miners and merchants whose income depends on
Bitcoin are going to risk a network split. Gavin isn't peddling some
mempool policy that doesn't affect consensus; the changes have to be
universally adopted by miners and full nodes. If there is any
uncertainty about that global acceptance, those financially dependent
on Bitcoin will not take the risk just to make a political point. You
can already see how conservative the mining community is from how
slowly it upgrades Bitcoin Core as it is. Even if some miners and
merchants generally support the idea of bigger blocks, they most
certainly are not going to take on the risk of leading a hard fork
when there is a substantial risk of it failing.

Until there is actual consensus among the technical community, I
wouldn't be too concerned.


> 4. I also think that ignoring the immediate problem is too risky. If
> allowing significantly larger blocks will cause a serious problem for
> Bitcoin (which is a possibility that we cannot rule out, as we lack
> omniscience), then NOT making any change to Bitcoin Core will virtually
> *assure* that we cause exactly this problem, as the popular (non-technical)
> consensus appears to be in favor of Bitcoin XT and a larger block size
> limit. If we do nothing, then there's a very real chance that Bitcoin XT
> takes over, for better or worse.
>

I don't think anyone is ignoring the issues; most accept that the
block size may eventually have to change. But the overwhelming
technical majority do not agree that there is a problem needing to be
addressed immediately. It would be far more helpful if we focused on
work that enables level 2 technologies so that Bitcoin can actually
scale (like R/CLTV and the malleability fixes, which are being delayed
by the BIP66 rollout and are pending the new "concurrent soft-forks"
proposal).


> 7. It's not a given that blocks will immediately expand to meet the hard
> limit. In fact, there are strong and compelling arguments why this will NOT
> happen. But in any software system, if a given scenario is *possible*, then
> one MUST assume that it will happen and must have a plan to handle it.
>

But of course it would be dealt with if and when it becomes
necessary. It's not as though there is blanket opposition to ever
increasing the block size; it's a matter of if, when, and how, and the
when is definitely not now.

9. My proposal is that we raise the block size limit *gradually*, using an
> approximately smooth function, without a step discontinuity. We can employ
> a linear growth function to adjust the block size limit *smoothly* from 1
> MB to 20 MB over the course of several years, beginning next March.
>

An automatic or dynamic block size increase risks being very
difficult to shut down if we later find it is negatively impacting the
ecosystem... and that's part of the reluctance about bigger blocks: we
still have not studied the potential downsides enough, beyond some
sketchy and disputed calculations, and overall it does not address
scalability at all.


> 10. This is the difference between cannonballing into the deep end of the
> pool and walking gingerly down the steps into the shallow end. Both get you
> to the eventual goal, but one is reckless while the other is measured and
> deliberate. If there's a problem that larger blocks will enable, then I'd
> prefer to see the problem crop up gradually rather than all at once. If
> it's gradual, then we'll have time to discuss and fix it without panicking.


Extending the block size now would be nothing more than a political
move. I have no idea what will be decided in the end, but I do know
that, in order for Bitcoin to survive, changes must be based on
well-thought-out and well-discussed technical merits, not on political
pressure. Politics and good software do not mix.

Drak

[-- Attachment #2: Type: text/html, Size: 5676 bytes --]

^ permalink raw reply	[flat|nested] 4+ messages in thread

* Re: [Bitcoin-development] Proposal: A measured response to save Bitcoin Core
  2015-05-31  9:35 ` Btc Drak
@ 2015-05-31 10:01   ` Eric Lombrozo
  0 siblings, 0 replies; 4+ messages in thread
From: Eric Lombrozo @ 2015-05-31 10:01 UTC (permalink / raw)
  To: Btc Drak; +Cc: Bitcoin Dev

[-- Attachment #1: Type: text/plain, Size: 1482 bytes --]

Drak,

I mostly agree with your assessment...except for your last claim.

Not that I wouldn't like to find a way to avoid politics, but as I've
argued before, it is inevitable that any consensus protocol that seeks
dynamism will sooner or later encounter politics.

The block size discussion, while ultimately necessary, is for now at
best merely serving as an example of the kind of political issue we
*really* need to find a solution for... and at worst it is a
distraction and an evasion.

Some protocol updates will be merely technical optimizations or
feature enhancements that are fairly uncontroversial... but some will
inevitably be highly controversial, with real-world economic
consequences, winners and losers. We lack a process for deciding these
issues. No matter how sophisticated we make the protocol, some things
will inevitably require external input to make these issues
decidable... it is a Goedelian implication. This external input could
be some sort of vote (of which hashing power is a particular kind) or
perhaps something else.

There's something to be said for building the dynamics of hard forks
*into* our model rather than avoiding them at all costs. However,
forking is the easy part; the difficulty is in merging the different
branches back together. Perhaps we should learn a thing or two from
git. Perhaps the question we should be asking is not "how do we avoid
hard forks?" but "how can we design the network to allow for merging?"

- Eric Lombrozo

[-- Attachment #2: Type: text/html, Size: 1635 bytes --]

^ permalink raw reply	[flat|nested] 4+ messages in thread

end of thread, other threads:[~2015-05-31 10:02 UTC | newest]

Thread overview: 4+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2015-05-31  0:29 [Bitcoin-development] Proposal: A measured response to save Bitcoin Core Matt Whitlock
2015-05-31  9:32 ` s7r
2015-05-31  9:35 ` Btc Drak
2015-05-31 10:01   ` Eric Lombrozo

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox