Hi Aleksey,

Yes, BIP158 uses the block hash to seed the hash function, which means filters of distinct blocks cannot be
aggregated for values they have in common. Aggregate filters over ranges of blocks would have to use some other
seed and could then achieve significant savings with the same design.
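
For illustration, a minimal Python sketch of the BIP158 hashing step (keyed blake2b stands in for the SipHash-2-4
the BIP actually specifies, since the standard library does not provide SipHash): the key is the first 16 bytes of
the block hash, so the same scriptPubKey maps to a different value in every block and identical entries cannot be
merged across blocks.

    import hashlib

    M = 784931   # BIP 158 false positive parameter
    P = 19       # BIP 158 Golomb-Rice coding parameter

    def filter_key(block_hash: bytes) -> bytes:
        # BIP 158 keys the hash with the first 16 bytes of the block hash, so
        # the same scriptPubKey hashes to different values in different blocks.
        return block_hash[:16]

    def hash_to_range(item: bytes, f: int, key: bytes) -> int:
        # BIP 158 uses SipHash-2-4 here; keyed blake2b is only a runnable
        # stand-in for this sketch.  The 64-bit output is mapped onto [0, F),
        # where F = N * M and N is the number of filter elements.
        h = int.from_bytes(hashlib.blake2b(item, key=key, digest_size=8).digest(), "little")
        return (h * f) >> 64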

I think that the most likely use of filters is to decide whether a newly announced block should be downloaded,
not scanning over the entire chain, which is where aggregate filters would help. I also suspect that whole-chain
scans would be better served by plain sequential reads in map-reduce style.

Typical clients do not care about filters for blocks before the birth date of their wallet’s keys, so they skip over the
majority of history, which is a bigger saving than any aggregate filter.

I wish we would get a filter commitment, as a commitment would unlock more utility than any marginal savings through
a more elaborate design.

Tamas Blummer

On Sep 19, 2019, at 19:20, admin--- via bitcoin-dev <bitcoin-dev@lists.linuxfoundation.org> wrote:

Hello list, 

Here is a link to a draft BIP for compact probabilistic block filters, an alternative to BIP 158.


Summary:

 - BIP 158's false positive rate is low; we can achieve lower total bandwidth with a higher false positive rate filter while syncing the blockchain (a rough cost sketch follows after this list).

 - BIP 158 does not support filter batching, by design of the parameters used for siphash and the optimal Golomb coding parameters.

 - Alternative compression with delta coding and splitting the data into 2 bit-string sequences: the first for data without prefixes, the second for information about the bit lengths written to the first sequence.
   The second sequence has a lot of duplicates and is compressed with 2 rounds of the Huffman algorithm (efficiency about 98% vs Golomb coding with optimal parameters). One possible reading of the split is sketched after this list.

 - Batching block filters reduces filter size significantly.

- Separating filters by address type allows a lite client to avoid downloading redundant information without compromising privacy.

- Lite client filter download strategy: get the biggest filter (smallest size-per-block ratio) for a range of blocks; on a positive test -> get medium filters to narrow the block range -> get per-block filters for the affected range -> download the affected blocks over Tor (a narrowing sketch follows after this list).
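
To illustrate the bandwidth trade-off from the first point, a back-of-the-envelope Python sketch; all numbers
(filter sizes, per-item false positive rates, block size, wallet size) are made-up assumptions for illustration,
not measurements:

    def expected_cost(filter_bytes, fp_rate_per_item, wallet_items, block_bytes):
        # Expected bytes per block = filter download + (probability that the
        # filter matches at least one watched item) * full block download.
        p_download = 1 - (1 - fp_rate_per_item) ** wallet_items
        return filter_bytes + p_download * block_bytes

    block_bytes = 1_300_000    # assumed average block size in bytes
    wallet_items = 100         # assumed number of watched scripts

    # BIP 158-like: per-item false positive rate 1/784931, larger filter
    low_fp = expected_cost(20_000, 1 / 784_931, wallet_items, block_bytes)

    # coarser filter: higher false positive rate, smaller filter
    high_fp = expected_cost(8_000, 1 / 50_000, wallet_items, block_bytes)

    print(low_fp, high_fp)     # compare expected bytes per block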
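
One possible reading of the delta coding / two-sequence split described above (a sketch of the idea only, not the
draft's reference code; the Huffman coding of the second sequence is left out):

    def split_encode(sorted_values):
        # Delta-code the sorted hashed values.  Each delta's binary digits,
        # minus the implicit leading 1 bit, go into sequence 1; its bit length
        # goes into sequence 2.  Sequence 2 is highly repetitive and would
        # then be Huffman coded (not shown here).
        data_bits = []     # sequence 1: value bits without prefixes
        length_syms = []   # sequence 2: bit lengths
        prev = 0
        for v in sorted_values:
            delta = v - prev
            prev = v
            bits = bin(delta)[2:]          # leading digit is always 1
            data_bits.append(bits[1:])     # drop the implicit leading 1
            length_syms.append(len(bits))  # how many bits to read back
        return "".join(data_bits), length_syms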
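
And a sketch of the narrowing strategy from the last point, here as a simple binary narrowing; fetch_filter(start,
count) and the filter's match_any() are hypothetical helpers standing in for whatever network and matching layer
the client uses:

    def matching_blocks(start, count, wallet_scripts):
        # Test the filter covering the whole range first; only if it matches,
        # narrow down with smaller filters until single blocks remain.
        flt = fetch_filter(start, count)
        if not flt.match_any(wallet_scripts):
            return []                  # whole range ruled out by one filter
        if count == 1:
            return [start]             # per-block filter matched
        half = count // 2
        return (matching_blocks(start, half, wallet_scripts) +
                matching_blocks(start + half, count - half, wallet_scripts))

    # Blocks at the returned heights would then be downloaded over Tor.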


Exact figures from mainnet for the sizes of filters separated by address type and for batch sizes will be added within a few days.

Thanks for any feedback.
      Aleksey Karpov
