Tamas Blummer [ARCHIVE] on Nostr
2023-06-07 18:20:41

📅 Original date posted: 2019-09-21
📝 Original message:

Hi Aleksey,

Yes, BIP158 uses the block hash to seed the hash function, which makes distinct block filters non-aggregatable,
since common values hash differently in each block. Aggregate filters over ranges of blocks would have to use
some other seed, and could then achieve significant savings with the same design.
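For concreteness, a minimal Python sketch of that seeding; keyed blake2b stands in here for BIP158's SipHash-2-4, and the constants follow BIP158:

import hashlib

M = 784931  # BIP158 false-positive parameter (target rate 1/M)

def hash_to_range(block_hash: bytes, item: bytes, n: int) -> int:
    # BIP158 keys SipHash-2-4 with the first 16 bytes of the block hash;
    # a keyed blake2b is used here purely to illustrate the seeding.
    k = block_hash[:16]
    h = int.from_bytes(hashlib.blake2b(item, key=k, digest_size=8).digest(), "big")
    return (h * n * M) >> 64  # map the 64-bit hash onto [0, n * M)

# The same scriptPubKey maps to unrelated values under different block hashes,
# which is why per-block filters cannot be merged without rehashing every item:
spk = bytes.fromhex("0014" + "00" * 20)
v1 = hash_to_range(b"\x01" * 32, spk, 1000)
v2 = hash_to_range(b"\x02" * 32, spk, 1000)
assert v1 != v2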

I think the most likely use of filters is deciding whether a newly announced block should be downloaded,
not scanning over the entire chain, which is where aggregate filters would help. I also suspect that
whole-chain scans would be better served by plain sequential reads in map-reduce style.

Typical clients do not care about filters for blocks before the birth date of their wallet’s keys, so they skip over
the majority of history, which is a bigger saving than any aggregate filter.
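A minimal sketch of that client behaviour; get_filter and filter_matches are placeholders for whatever the client's networking layer provides, and the birth height is only an example value:

from typing import Callable, Iterable

WALLET_BIRTH_HEIGHT = 600_000  # example: first block that can involve the wallet's keys

def should_download(height: int,
                    wallet_scripts: Iterable[bytes],
                    get_filter: Callable[[int], object],
                    filter_matches: Callable[[object, Iterable[bytes]], bool]) -> bool:
    # Blocks older than the wallet cannot pay to its keys, so skip their filters entirely.
    if height < WALLET_BIRTH_HEIGHT:
        return False
    # Otherwise fetch the per-block filter and test the wallet's scripts against it;
    # a rare false positive only costs one unnecessary block download.
    return filter_matches(get_filter(height), wallet_scripts)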

I wish we would get a filter committed to blocks, as a commitment would unlock more utility than any marginal
savings from a more elaborate design.

Tamas Blummer

> On Sep 19, 2019, at 19:20, admin--- via bitcoin-dev <bitcoin-dev at lists.linuxfoundation.org> wrote:
>
> Hello list,
>
> Here is a link to a draft BIP for compact probabilistic block filters, as an alternative to BIP 158:
>
> https://docs.google.com/document/d/1jH9tEUyb9w2OZd4-kxfGuyNIIZzmgkEb_z0qSxv80ik/edit?usp=sharing
>
> Summary:
>
> - The BIP 158 false positive rate is low; we can achieve lower bandwidth with a higher false positive rate filter while syncing the blockchain.
>
> - BIP 158 does not support filter batching, by design of its parameters: the siphash seed and the optimal Golomb coding parameters are chosen per block.
>
> - Alternative compression with delta coding, splitting the data into 2 bit-string sequences: the first for the data without prefixes, the second for the bit lengths of the values written to the first sequence.
> The second sequence has many duplicates and is compressed with 2 rounds of the Huffman algorithm. (Efficiency about 98% vs Golomb coding with optimal parameters.)
>
> - Batching block filters reduces filter size significantly.
>
> - Separating filters by address type allows a lite client to avoid downloading redundant information without compromising privacy.
>
> - Lite client filter download strategy: get the biggest filter (smallest size-per-block ratio) for a range of blocks; on a positive test -> get medium filters to narrow the block range -> get block filters for the affected range -> download the affected blocks over Tor.
>
> Implementation (Python): https://github.com/bitaps-com/pybtc/blob/bugfix/pybtc/functions/filters.py#L172
>
> Exact mainnet figures on the sizes of filters separated by address type, and on batch sizes, will be added within a few days.
>
> Thanks for any feedback.
> Aleksey Karpov
>
> _______________________________________________
> bitcoin-dev mailing list
> bitcoin-dev at lists.linuxfoundation.org
> https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev
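For concreteness, the batched download strategy summarised in the quoted proposal could look roughly like the sketch below; get_batch_filter and filter_matches are placeholders for the proposed batch filters and matching, and the sketch simply halves the range recursively instead of using fixed big/medium/per-block tiers. The heights it returns would then be fetched as full blocks (over Tor, as proposed).

from typing import Callable, Iterable, List, Tuple

Range = Tuple[int, int]  # inclusive block-height range

def affected_blocks(rng: Range,
                    wallet_scripts: Iterable[bytes],
                    get_batch_filter: Callable[[Range], object],
                    filter_matches: Callable[[object, Iterable[bytes]], bool],
                    min_batch: int = 1) -> List[int]:
    # Test the coarsest filter first: if it does not match, the whole range
    # is skipped at the cost of a single small filter download.
    if not filter_matches(get_batch_filter(rng), wallet_scripts):
        return []
    lo, hi = rng
    if hi - lo + 1 <= min_batch:
        # Per-block level reached: these are the blocks to download.
        return list(range(lo, hi + 1))
    # Otherwise split the range and test progressively finer filters.
    mid = (lo + hi) // 2
    return (affected_blocks((lo, mid), wallet_scripts,
                            get_batch_filter, filter_matches, min_batch)
            + affected_blocks((mid + 1, hi), wallet_scripts,
                              get_batch_filter, filter_matches, min_batch))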
