public inbox for bitcoindev@googlegroups.com
From: Steve <shadders.del@gmail•com>
To: bitcoin-development@lists•sourceforge.net
Subject: [Bitcoin-development] Building a node crawler to map network
Date: Tue, 06 Sep 2011 17:42:30 +1000	[thread overview]
Message-ID: <4E65CEE6.7030002@gmail.com> (raw)

Hi All,

I started messing around today with building a node crawler to try and
map out the bitcoin network and hopefully provide some useful
statistics.  It's very basic so far, using a mutilated bitcoinj to
connect (due to me being a Java developer and not having a clue about
C/C++).  If it's worthwhile I'll hack bitcoinj some more to run on top
of Netty to take advantage of its NIO architecture (Netty has been
shown to handle half a million concurrent connections, so it would be
ideal for the purpose).
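
To give an idea of the shape of it, the crawl itself is really just a
breadth-first walk over whatever getaddr hands back.  Something like
the sketch below -- note PeerClient/connect are just placeholders for
whatever my hacked-up bitcoinj ends up exposing, not real bitcoinj API:

    import java.net.InetSocketAddress;
    import java.util.HashSet;
    import java.util.LinkedList;
    import java.util.List;
    import java.util.Queue;
    import java.util.Set;

    public class CrawlerSketch {

        // Placeholder for whatever the hacked-up bitcoinj exposes: does the
        // version/verack handshake and a getaddr round trip for one peer.
        interface PeerClient {
            List<InetSocketAddress> getAddr() throws Exception;
            void close();
        }

        // Hypothetical factory -- connect + handshake, throws if unreachable.
        static PeerClient connect(InetSocketAddress addr) throws Exception {
            throw new UnsupportedOperationException("wire up to bitcoinj/Netty");
        }

        public static void main(String[] args) throws Exception {
            Queue<InetSocketAddress> toVisit = new LinkedList<InetSocketAddress>();
            Set<InetSocketAddress> seen = new HashSet<InetSocketAddress>();
            toVisit.add(new InetSocketAddress(args[0], 8333)); // seed node

            while (!toVisit.isEmpty()) {
                InetSocketAddress addr = toVisit.poll();
                if (!seen.add(addr)) continue;        // already visited
                try {
                    PeerClient peer = connect(addr);
                    List<InetSocketAddress> advertised = peer.getAddr();
                    peer.close();
                    // store(addr, version info, advertised) -- persistence omitted
                    toVisit.addAll(advertised);       // widen the frontier
                } catch (Exception e) {
                    // couldn't connect -- apparently the common case; note it and move on
                }
            }
        }
    }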

Hoping to get a bit of input on what would be useful, as well as a
strategy for getting the maximum possible number of connections without
distorting the data.  I seem to recall Gavin talking about the need for
some kind of network health monitoring, so I assume there's a need for
something like this...

Firstly, at the moment I'm basically just storing the version message
and the results of getaddr for each node that I can connect to.  Is
there any other useful info worth collecting from a node?
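
For reference, the record I'm keeping per reachable node currently
looks roughly like this (field names are mine; the contents are just
the version message fields plus whatever getaddr returned):

    import java.net.InetSocketAddress;
    import java.util.Date;
    import java.util.List;

    // One row per node we managed to talk to.
    public class NodeRecord {
        public InetSocketAddress address;               // who we connected to
        public int protocolVersion;                     // "version" field of the version message
        public long services;                           // service bits (e.g. NODE_NETWORK)
        public Date peerTimestamp;                      // the node's clock when it sent version
        public String subVersion;                       // client sub-version string
        public long startHeight;                        // the node's best block height
        public List<InetSocketAddress> advertisedPeers; // what getaddr returned
        public Date crawledAt;                          // when we collected it
    }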

The second and main issue is how to connect.  From my first very basic
probing it seems the vast majority of nodes don't accept incoming
connections, no doubt due to lack of UPnP.  So the active crawl
approach is not really ideal for the purpose; even if it were used, the
resulting data would be hopelessly distorted.

A honeypot approach would probably be better, if there were some way to
make a node 'attractive' for other nodes to connect to.  That way it
could capture non-listening nodes as well.  If there is some way to
influence other nodes to connect to the crawler node, that solves the
problem.  If there isn't, which I suspect is the case, then perhaps
another approach is to build an easy-to-deploy crawler node that many
volunteers could run, and that could then upload collected data to a
central repository.
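
If it comes to the volunteer-run approach, the upload side could be as
dumb as each crawler node POSTing its batch of results to a collection
endpoint.  Rough sketch only -- the URL and payload format here are
made up:

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ResultUploader {

        // POST one batch of crawl results to the central repository.
        // The endpoint and the JSON shape are placeholders.
        public static void upload(String payload) throws Exception {
            URL url = new URL("http://example.org/crawler/submit"); // hypothetical collector
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "application/json");
            OutputStream out = conn.getOutputStream();
            out.write(payload.getBytes("UTF-8"));
            out.close();
            if (conn.getResponseCode() != 200) {
                throw new RuntimeException("upload failed: HTTP " + conn.getResponseCode());
            }
            conn.disconnect();
        }
    }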

While I'm asking questions I'll add one more regarding the getaddr
message.  It seems most nodes return about 1000 addresses in response
to this message.  Obviously most of these nodes haven't actually talked
to all 1000 on the list, so where does this list come from?  Is it a
mixture of addresses obtained from other nodes, somehow sorted by
timestamp?  Does it include some nodes discovered via IRC/DNS, or are
those only used to find the first nodes to connect to?

Thanks for any input... Hopefully I can build something that's useful 
for the network...



Thread overview: 10+ messages
2011-09-06  7:42 Steve [this message]
2011-09-06  8:29 ` Steve
2011-09-06  8:36   ` Christian Decker
2011-09-06 12:49     ` Mike Hearn
2011-09-06 13:27       ` Steve
2011-09-06 13:31         ` Mike Hearn
2011-09-06 14:17           ` Steve
2011-09-06 14:52             ` Mike Hearn
2011-09-06 15:25               ` Steve
2011-09-06 14:36 ` Rick Wesson
