
utilities-clj.benchmark

Provides benchmarking functionality.

bench

(bench & args)

Run the project's benchmark tests.

To add this task to a project, create a "leiningen" directory in your project's src directory (project root/src). Put your tasks in there, like src/leiningen/your-task-name.clj. Next, add a function named "your-task-name" in the namespace "leiningen.your-task-name"; it takes a project argument containing the information defined in defproject, plus any command-line arguments. From "your-task-name", call this task with those command-line arguments.
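
A minimal sketch of such a task file ("your-task-name" is a placeholder, and it assumes utilities-clj.benchmark is available on the task's classpath):

    ;; src/leiningen/your-task-name.clj
    (ns leiningen.your-task-name
      (:require [utilities-clj.benchmark :as benchmark]))

    (defn your-task-name
      "Run the project's benchmark tests."
      [project & args]
      ;; project is the defproject map; args are the command-line arguments,
      ;; which are forwarded to the bench task.
      (apply benchmark/bench args))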

Arguments to this task are a list of namespaces containing benchmark tests, together with benchmark options. If no namespace is specified, all namespaces in the "benchmarks" directory will be loaded.

Recognized formats of namespace arguments: "namespace-name", "namespace-name/benchmark-test-name"
  "namespace-name"
    all benchmark tests in this namespace will be executed,
  "namespace-name/benchmark-test-name"
    the specified benchmark test in this namespace will be executed.

Recognized benchmark options: :quick
  :quick forces benchmark tests to be run with criterium.core/quick-benchmark;
    without it, benchmark tests are run with criterium.core/benchmark.

Define your benchmark tests using the defbenchmark macro. Put them into the "benchmarks" directory in your project root.
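
For example, a benchmark namespace in that directory might look like the following (the file, namespace, and test names are illustrative; the require mirrors the Usage example of defbenchmark below):

    ;; benchmarks/example.clj
    (ns benchmarks.example
      (:require [utilities.benchmark :refer :all]))

    (defbenchmark benchmark-sum
      "Sum of arguments."
      [x y]
      (identity [1 2])
      (+ x y))

Assuming the task is exposed as "bench", it could then be invoked as, e.g., lein bench benchmarks.example to run every test in that namespace, or lein bench benchmarks.example/benchmark-sum :quick to run a single test with quick benchmarking.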

References

[1] https://nakkaya.com/2010/02/25/writing-leiningen-plugins-101/ 

defbenchmark (macro)

(defbenchmark benchmark-name doc-string arg-list setup body)

Generate benchmark test.

The function to benchmark is defined by [benchmark-name] and [doc-string] as its name and docstring, [arg-list] as its input arguments, and [body] as its body. The return value of [setup] is passed to this function as its arguments during benchmarking; [setup] should always return a collection of arguments.

After execution, your benchmark test will return benchmark results from Criterium measurements.

The generated benchmark test takes one argument. If it is true, benchmarking is done with criterium.core/quick-benchmark; otherwise it is done with criterium.core/benchmark.

Put your benchmark tests into a "benchmarks" directory in your project root.

Usage

    (:require [utilities.benchmark :refer :all])

    (defbenchmark benchmark-sum
      "Sum of arguments."
      [x y]
      (identity [1 2])
      (+ x y))

    (benchmark-sum true)
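
The call returns Criterium's raw measurement data. One way to get a readable summary (a sketch, assuming Criterium is on the classpath, as it must be for defbenchmark to work) is criterium.core/report-result:

    (require '[criterium.core :as criterium])

    ;; report-result prints a human-readable summary of the measurements
    (criterium/report-result (benchmark-sum true))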

References

[1] https://github.com/hugoduncan/criterium 
