From 2ecdf3169255eb9b7a5c022b3887e3105a2c1904 Mon Sep 17 00:00:00 2001
From: Felmon Fekadu
Date: Wed, 25 Mar 2026 16:52:10 -0600
Subject: [PATCH 1/3] docs: document benchmark_min_time CLI forms

---
 AUTHORS            |  1 +
 CONTRIBUTORS       |  1 +
 docs/user_guide.md | 22 ++++++++++++++++++++++
 3 files changed, 24 insertions(+)

diff --git a/AUTHORS b/AUTHORS
index f3f29d6964..bea19b3056 100644
--- a/AUTHORS
+++ b/AUTHORS
@@ -32,6 +32,7 @@ Evgeny Safronov
 Fabien Pichot
 Federico Ficarelli
 Felix Homann
+Felmon Fekadu
 Gergely Meszaros
 Gergő Szitár
 Google Inc.
diff --git a/CONTRIBUTORS b/CONTRIBUTORS
index a88240c60b..c3f6789aff 100644
--- a/CONTRIBUTORS
+++ b/CONTRIBUTORS
@@ -52,6 +52,7 @@ Fabien Pichot
 Fanbo Meng
 Federico Ficarelli
 Felix Homann
+Felmon Fekadu
 Geoffrey Martin-Noble
 Gergely Meszaros
 Gergő Szitár
diff --git a/docs/user_guide.md b/docs/user_guide.md
index b2e6975361..d29463685f 100644
--- a/docs/user_guide.md
+++ b/docs/user_guide.md
@@ -252,6 +252,28 @@ iterations is at least one, not more than 1e9, until CPU time is greater than
 the minimum time, or the wallclock time is 5x minimum time. The minimum
 time is set per benchmark by calling `MinTime` on the registered
 benchmark object.
+The minimum time can also be set for all benchmarks with the
+`--benchmark_min_time` command-line option. This flag supports two
+forms:
+
+* `--benchmark_min_time=<float>s` sets the minimum running time for each
+  benchmark repetition in seconds.
+* `--benchmark_min_time=<integer>x` runs each benchmark repetition for an
+  explicit number of iterations instead of using the dynamic time-based
+  iteration selection. This applies to benchmarks that do not already specify
+  an explicit iteration count in code.
+
+For compatibility, bare numeric values such as `--benchmark_min_time=0.5` are
+also interpreted as seconds, but the explicit `s` suffix is preferred for
+clarity.
+
+For example:
+
+```bash
+$ ./run_benchmarks.x --benchmark_min_time=0.5s
+$ ./run_benchmarks.x --benchmark_min_time=100x
+```
+
 Furthermore warming up a benchmark might be necessary in order to get
 stable results because of e.g caching effects of the code under benchmark.
 Warming up means running the benchmark a given amount of time, before

From c40794da647ffcc7bd0e807e6475a4f53ae16c9e Mon Sep 17 00:00:00 2001
From: Felmon Fekadu
Date: Wed, 25 Mar 2026 17:09:16 -0600
Subject: [PATCH 2/3] chore: trigger CLA rescan

From c1bfc2bb2a964bfeeaf0fb2b6b4e5f5c286f3eb2 Mon Sep 17 00:00:00 2001
From: Felmon Fekadu
Date: Wed, 25 Mar 2026 21:09:45 -0600
Subject: [PATCH 3/3] docs: clarify benchmark_min_time precedence

---
 docs/user_guide.md | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/docs/user_guide.md b/docs/user_guide.md
index d29463685f..c09d77554f 100644
--- a/docs/user_guide.md
+++ b/docs/user_guide.md
@@ -274,6 +274,10 @@ $ ./run_benchmarks.x --benchmark_min_time=0.5s
 $ ./run_benchmarks.x --benchmark_min_time=100x
 ```
 
+If a benchmark specifies its own `MinTime()` or `Iterations()` in code, those
+per-benchmark settings take precedence over the corresponding
+`--benchmark_min_time` command-line forms.
+
 Furthermore warming up a benchmark might be necessary in order to get
 stable results because of e.g caching effects of the code under benchmark.
 Warming up means running the benchmark a given amount of time, before
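
As an aside on the value grammar these patches document: the two suffix forms plus the bare-number compatibility form can be sketched as a tiny parser. This is an illustrative sketch only, not the library's actual parsing code, and `parse_min_time` is an invented name used here for clarity:

```python
def parse_min_time(value: str):
    """Illustrative parser for the documented --benchmark_min_time forms.

    "<integer>x" -> ("iterations", n)  explicit per-repetition iteration count
    "<float>s"   -> ("seconds", t)     minimum running time per repetition
    bare number  -> ("seconds", t)     compatibility form, treated as seconds
    """
    if value.endswith("x"):
        # Explicit iteration count, e.g. "100x".
        return ("iterations", int(value[:-1]))
    if value.endswith("s"):
        # Minimum running time in seconds, e.g. "0.5s".
        return ("seconds", float(value[:-1]))
    # Compatibility: a bare numeric value is interpreted as seconds.
    return ("seconds", float(value))
```

For example, `parse_min_time("0.5s")` yields `("seconds", 0.5)` and `parse_min_time("100x")` yields `("iterations", 100)`, mirroring the two shell invocations shown in the patch.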