@moreal
Created April 20, 2020 17:34
A benchmark comparing several ways of merging two dictionaries in Python, to find the fastest.
$ pytest --benchmark-compare
/Users/moreal/.pyenv/versions/3.7.4/envs/my-ground/lib/python3.7/site-packages/pytest_benchmark/logger.py:44: PytestBenchmarkWarning: Can't compare. No benchmark files in '/private/tmp/test/pytest/.benchmarks'. Can't load the previous benchmark.
warner(PytestBenchmarkWarning(text))
==================================================================================== test session starts ====================================================================================
platform darwin -- Python 3.7.4, pytest-5.4.1, py-1.8.1, pluggy-0.13.1
benchmark: 3.2.3 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /private/tmp/test/pytest
plugins: benchmark-3.2.3
collected 3 items
tests/test_dict.py ... [100%]
-------------------------------------------------------------------------------------- benchmark: 3 tests --------------------------------------------------------------------------------------
Name (time in ms)                  Min                Max               Mean            StdDev             Median               IQR  Outliers       OPS            Rounds  Iterations
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
test_benchmark_update          30.3878 (1.0)      31.2581 (1.0)      30.6925 (1.0)     0.2579 (1.0)      30.6283 (1.0)      0.3767 (1.0)      10;0  32.5813 (1.0)      31           1
test_benchmark_unpack_repack   49.9390 (1.64)     51.6635 (1.65)     50.4087 (1.64)    0.4511 (1.75)     50.3805 (1.64)     0.5255 (1.40)      2;1  19.8379 (0.61)    13           1
test_construct_double_update   78.9237 (2.60)     81.4208 (2.60)     79.9099 (2.60)    0.7377 (2.86)     79.8507 (2.61)     0.7743 (2.06)      4;1  12.5141 (0.38)    12           1
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Legend:
Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
OPS: Operations Per Second, computed as 1 / Mean
===================================================================================== 3 passed in 4.06s =====================================================================================
dict_samples = [({'x': i}, {'y': i + 1}) for i in range(100000)]

def unpack_repack(xd, yd):
    # Build a new dict by unpacking both inputs into a dict display.
    return {
        **xd,
        **yd,
    }

def repeat_unpack_repack():
    return [unpack_repack(*ss) for ss in dict_samples.copy()]

def update(xd, yd):
    # Merge in place: mutates xd instead of building a new dict.
    # Note that dict_samples.copy() only copies the outer list (a shallow
    # copy), so these mutations persist in the sample dicts across rounds.
    xd.update(yd)
    return xd

def repeat_update():
    return [update(*ss) for ss in dict_samples.copy()]

def construct_double_update(xd, yd):
    # Start from an empty dict and update() both inputs into it.
    d = {}
    d.update(xd)
    d.update(yd)
    return d

def repeat_construct_double_update():
    return [construct_double_update(*ss) for ss in dict_samples.copy()]

def test_benchmark_unpack_repack(benchmark):
    # `benchmark` is the pytest-benchmark fixture.
    benchmark(repeat_unpack_repack)

def test_benchmark_update(benchmark):
    benchmark(repeat_update)

def test_construct_double_update(benchmark):
    benchmark(repeat_construct_double_update)
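A caveat worth keeping in mind when reading the numbers: update is fastest partly because it allocates nothing at all, mutating its first argument rather than producing a new dict. A minimal sketch of the semantic difference (the names here are illustrative, not part of the benchmark):

```python
a = {'x': 1}
b = {'y': 2}

# unpack_repack-style merge: builds a brand-new dict, inputs untouched.
merged = {**a, **b}
assert merged == {'x': 1, 'y': 2}
assert a == {'x': 1}          # original left intact

# update-style merge: no new dict, the first operand is mutated.
result = a
result.update(b)
assert result is a            # same object, modified in place
assert a == {'x': 1, 'y': 2}  # original has changed
```

So the three benchmarks are not interchangeable: only unpack_repack and construct_double_update leave their inputs unchanged.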
moreal commented Apr 20, 2020

In conclusion, update is the fastest overall, but only because it mutates xd in place instead of building a new dict. Among the methods that actually produce a new dict, unpack_repack is the fastest, because {**xd, **yd} compiles down to a single dedicated instruction, BUILD_MAP_UNPACK.
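The bytecode claim can be checked with the dis module. On CPython 3.7 (as used above) the dict display {**xd, **yd} compiles to a single BUILD_MAP_UNPACK instruction; on 3.9 and later the compiler emits DICT_UPDATE instead, so the exact opcode you see is version-dependent:

```python
import dis

def unpack_repack(xd, yd):
    return {**xd, **yd}

# Collect the opcode names that the merge compiles to.
opnames = {ins.opname for ins in dis.get_instructions(unpack_repack)}

# BUILD_MAP_UNPACK on CPython 3.5-3.8, DICT_UPDATE on 3.9+.
print(opnames & {'BUILD_MAP_UNPACK', 'DICT_UPDATE'})
```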
