largeNbDicts
=====================

`largeNbDicts` is a benchmark tool
dedicated to the specific scenario of
dictionary decompression using a very large number of dictionaries.
When dictionaries are constantly changing, they are always "cold",
suffering from increased latency due to cache misses.

This tool was created to investigate performance in this scenario,
and to experiment with mitigation techniques.

Command line :
```
largeNbDicts [Options] filename(s)

Options :
-z          : benchmark compression (default)
-d          : benchmark decompression
-r          : recursively load all files in subdirectories (default: off)
-B#         : split input into blocks of size # (default: no split)
-#          : use compression level # (default: 3)
-D #        : use # as a dictionary (default: create one)
-i#         : nb benchmark rounds (default: 6)
--nbBlocks=#: use # blocks for bench (default: one per file)
--nbDicts=# : create # dictionaries for bench (default: one per block)
-h          : help (this text)

Advanced Options (see zstd.h for documentation) :
--dedicated-dict-search
--dict-content-type=#
--dict-attach-pref=#
```
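For illustration, a typical invocation might look like the following sketch. The input file name `silesia.tar` is only an assumed example; the flags used are all taken from the option list above:

```shell
# Hypothetical example, assuming `largeNbDicts` has been built
# and `silesia.tar` is available in the current directory.
#
# Benchmark decompression (-d) of the input split into 4 KB blocks (-B4096),
# creating 10000 dictionaries (--nbDicts=10000) so that each block is
# decompressed with a different, "cold" dictionary, over 6 rounds (-i6).
./largeNbDicts -d -B4096 --nbDicts=10000 -i6 silesia.tar
```

Using many more dictionaries than fit in cache is what makes each dictionary access "cold", which is the scenario this tool is designed to measure.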