Compression on SAFEnet

I’ve read that SAFEnet will use compression, but I still haven’t found which library will be used or whether it’s already been implemented. I was looking at zstd, which looks pretty nice:

and also:

Have you considered any of those libraries?


SAFE uses Brotli.


At Google, we think that internet users’ time is valuable, and that they shouldn’t have to wait long for a web page to load. Because fast is better than slow, two years ago we published the Zopfli compression algorithm. This received such positive feedback in the industry that it has been integrated into many compression solutions, ranging from PNG optimizers to preprocessing web content. Based on its use and other modern compression needs, such as web font compression, today we are excited to announce that we have developed and open sourced a new algorithm, the Brotli compression algorithm.
While Zopfli is Deflate-compatible, Brotli is a whole new data format. This new format allows us to get 20–26% higher compression ratios over Zopfli. In our study ‘Comparison of Brotli, Deflate, Zopfli, LZMA, LZHAM and Bzip2 Compression Algorithms’ we show that Brotli is roughly as fast as zlib’s Deflate implementation. At the same time, it compresses slightly more densely than LZMA and bzip2 on the Canterbury corpus. The higher data density is achieved by a 2nd order context modeling, re-use of entropy codes, larger memory window of past data and joint distribution codes. Just like Zopfli, the new algorithm is named after Swiss bakery products. Brötli means ‘small bread’ in Swiss German.
The smaller compressed size allows for better space utilization and faster page loads. We hope that this format will be supported by major browsers in the near future, as the smaller compressed size would give additional benefits to mobile users, such as lower data transfer fees and reduced battery use.
By Zoltan Szabadka, Software Engineer, Compression Team

Unlike most general purpose compression algorithms, Brotli uses a pre-defined 120 kilobyte dictionary. The dictionary contains over 13000 common words, phrases and other substrings derived from a large corpus of text and HTML documents.[6][7] A pre-defined algorithm can give a compression density boost for short data files.

I think this is where the magic happens in a browser. In HTML and similar web content there’s a lot of markup and script that gets repeated over and over again.
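Brotli’s built-in dictionary isn’t exposed in Python’s standard library, but the same idea can be demonstrated with zlib’s preset-dictionary support (`zdict`). This is a minimal sketch, assuming a tiny hand-made dictionary rather than Brotli’s real 120 KB one; the HTML strings are made-up examples:

```python
import zlib

# A preset dictionary of substrings we expect to occur in the data.
# (Brotli ships a ~120 KB built-in dictionary; here we supply our own tiny one.)
DICTIONARY = b'<!DOCTYPE html><html><head><meta charset="utf-8"></head><body></body></html>'

data = b'<html><head><meta charset="utf-8"></head><body>hello</body></html>'

# Plain Deflate, no dictionary.
plain = zlib.compress(data, 9)

# Deflate with the preset dictionary: matches against DICTIONARY are
# available from the very first byte, which helps short inputs a lot.
comp = zlib.compressobj(9, zlib.DEFLATED, 15, 9, zlib.Z_DEFAULT_STRATEGY, zdict=DICTIONARY)
with_dict = comp.compress(data) + comp.flush()

# The decompressor must be given the same dictionary to reconstruct the data.
decomp = zlib.decompressobj(zdict=DICTIONARY)
restored = decomp.decompress(with_dict)

print(len(data), len(plain), len(with_dict))
```

On short inputs like this one the dictionary version comes out noticeably smaller, because the first occurrence of each substring can already be encoded as a back-reference into the dictionary instead of literal bytes.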


Dropbox released a new implementation of Brotli in Rust that is reportedly ~10x faster than the one used by SAFEnet. I think the developers should switch to this one:

What do you guys think?


Generally speaking, most modern compression algorithms achieve roughly the same compression ratio. As for the number of cores, it’s up to you how many to use; unless you’re creating large archives there’s usually no reason to use more than one, and with multiple cores doing the compression the bottleneck may become the hard drive. Legacy Zip compression is akin to the Deflate method in 7-Zip, and offers the most compatibility between different compression tools.
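The “roughly the same compression” point is easy to check with the three compressors that ship in Python’s standard library (zstd and Brotli aren’t stdlib, so zlib, bz2 and lzma stand in here); the sample data is just an illustrative repetitive HTML-ish string:

```python
import bz2
import lzma
import zlib

# Mildly repetitive text, similar in spirit to web content.
data = b'<div class="item"><span>hello world</span></div>\n' * 200

results = {
    "zlib (Deflate)": zlib.compress(data, 9),
    "bz2": bz2.compress(data, 9),
    "lzma": lzma.compress(data),
}

for name, blob in results.items():
    print(f"{name:>15}: {len(data)} -> {len(blob)} bytes")
```

All three shrink this input by well over 90%; the differences between them are small next to the difference between compressing and not compressing at all.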


I’m looking at the self encryption diagram on the wiki, is Brotli used for the “Optional Compress” step described there?
If so, does this mean it’s not optional anymore, or there are only some scenarios when it’s indeed used to compress the chunks?

Yes, it is, and it is always used in self-encryption; it adds a little more confusion to the process. Hope that helps.
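For anyone trying to picture the compress-then-encrypt step: below is a toy sketch of the idea, not SAFE’s actual self_encryption code. It assumes zlib as a stand-in for Brotli and a SHA-256 counter keystream as a stand-in for the real cipher, and it derives each chunk’s key from the hash of a sibling chunk (the real derivation in SAFE differs in its details):

```python
import hashlib
import zlib

def _keystream(key: bytes, length: int) -> bytes:
    """Toy SHA-256 counter keystream -- a stand-in for a real cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def self_encrypt(data: bytes, chunk_size: int = 1024):
    """Compress each chunk, then encrypt it with a key derived from the
    hash of a sibling chunk. Illustrative only -- not SAFE's real code."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    pre_hashes = [hashlib.sha256(c).digest() for c in chunks]
    encrypted = []
    for i, chunk in enumerate(chunks):
        compressed = zlib.compress(chunk, 9)        # Brotli in the real thing
        key = pre_hashes[(i + 1) % len(chunks)]     # sibling-chunk hash as key
        ks = _keystream(key, len(compressed))
        encrypted.append(bytes(a ^ b for a, b in zip(compressed, ks)))
    # pre_hashes play the role of the "data map" needed to reassemble the file
    return encrypted, pre_hashes

def self_decrypt(encrypted, pre_hashes):
    out = []
    for i, blob in enumerate(encrypted):
        key = pre_hashes[(i + 1) % len(pre_hashes)]
        ks = _keystream(key, len(blob))
        out.append(zlib.decompress(bytes(a ^ b for a, b in zip(blob, ks))))
    return b"".join(out)
```

The point of compressing before encrypting is that it must happen in that order: well-encrypted data looks random and is incompressible, so the compression step only helps if it runs on the plaintext chunk first.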