# clojure2d.color

Color functions.

This namespace contains color manipulation functions, which can be divided into the following groups:

* Representation
* Channel manipulations
* Conversions
* Blending
* Palettes / gradients
* Distances

## Representation

A color can be represented by the following types:

* fastmath `Vec4` - the core type, representing 3 color channels and alpha (RGBA). Values are of `double` type in the `[0-255]` range. The [[color]] and [[gray]] creators return a `Vec4` representation. To ensure `Vec4` use the [[to-color]] function.
* fastmath `Vec3` - 3 color channels, with `alpha` assumed to be `255`.
* `java.awt.Color` - Java AWT representation. Creators are [[awt-color]] and [[awt-gray]]. Use [[to-awt-color]] to convert to this representation.
* `keyword` - one of the defined names (see [[named-colors-list]])
* `Integer` - packed ARGB value. Example: `0xffaa01`.
* `String` - CSS ("#ab1122") or a 6-character string containing a hexadecimal representation ("ffaa01")
* any `seqable` - a list or vector containing 2-4 elements. Conversion is done by applying the content to the [[color]] function.
* `nil` - returns `nil` during color conversion.

To create a color from individual channel values use the [[color]] function. To create a gray of given intensity call [[gray]].

By default a color is treated as `RGB` with channel values in the `[0.0-255.0]` range, inclusive.
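
For example, a quick sketch of the creators listed above (assuming the namespace is aliased as `c`):

```
(require '[clojure2d.color :as c])

(c/color 255 170 1)       ;; Vec4 from three RGB channels, alpha defaults to 255
(c/color 255 170 1 128)   ;; with an explicit alpha channel
(c/gray 128)              ;; gray of a given intensity
(c/to-color :red)         ;; named color
(c/to-color "#ab1122")    ;; CSS string
(c/to-color 0xffaa01)     ;; packed ARGB integer
(c/to-color [255 170 1])  ;; seqable with 2-4 elements
```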

[Coloured list of all names](../static/colors.html)

## Color / channel manipulations

You can access individual channels by calling one of the following:

* [[red]] or [[ch0]] - to get first channel value.
* [[green]] or [[ch1]] - to get second channel value.
* [[blue]] or [[ch2]] - to get third channel value.
* [[alpha]] - to get alpha value.
* [[luma]] - to get luma or brightness (range from `0` (black) to `255` (white)).
* [[hue]] - to get hue value in degrees (range from 0 to 360). Hexagon projection.
* [[hue-polar]] - to get hue from polar transformation.

[[set-ch0]], [[set-ch1]], [[set-ch2]] and [[set-alpha]] return a new color with the respective channel set to a new value.

To make a color darker/brighter use the [[darken]] / [[lighten]] functions. Operations are done in the `Lab` color space.

To change saturation call [[saturate]] / [[desaturate]]. Operations are done in the `LCH` color space.

You can also rely on `VectorProto` from the `fastmath` library and treat colors as vectors.
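
A short sketch of the accessors and modifiers above (continuing with the `c` alias):

```
(def col (c/color 30 120 200))

(c/red col)             ;; first channel value
(c/luma col)            ;; brightness, 0-255
(c/hue col)             ;; hue in degrees, 0-360
(c/set-alpha col 128)   ;; new color with alpha set to 128
(c/darken col)          ;; darker color (computed in Lab)
(c/saturate col)        ;; more saturated color (computed in LCH)
```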

## Conversions

A color can be converted from RGB to other color spaces (and back). Available color spaces are listed under the [[colorspaces-list]] variable. There are two types of conversions:

* raw - with names `to-XXX` and `from-XXX` where `XXX` is the color space name. Every color space has its own value range for each channel. `(comp from-XXX to-XXX)` acts almost as the identity.
* normalized - with names `to-XXX*` and `from-XXX*` where `XXX` is the color space name. `to-XXX*` returns values normalized to the `[0-255]` range. `from-XXX*` also expects channel values in the `[0-255]` range.

NOTE: a color carries no information about which color space it is in. Interpretation is entirely up to your code.

Color space conversion functions are collected in two maps: [[colorspaces]] for raw and [[colorspaces*]] for normalized functions. Keys are color space names as keywords; values are vectors with the `to-` fn as the first element and the `from-` fn as the second.
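
For example, a sketch of raw, normalized and map-based conversions (assuming `:LCH` is one of the keys in [[colorspaces]]):

```
(def col (c/color 200 100 50))

(c/to-HSB col)                 ;; raw conversion, HSB-specific ranges
(c/from-HSB (c/to-HSB col))    ;; back to RGB, almost the original color
(c/to-HSB* col)                ;; normalized variant, channels in [0-255]

(let [[to-LCH from-LCH] (c/colorspaces :LCH)]
  (from-LCH (to-LCH col)))     ;; round trip through the map entry
```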

## Blending

You can blend two colors (or individual channel values) using one of the methods from [[blends-list]]. All functions are collected in the [[blends]] map, with blend names as keys and blending functions as values.
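
A minimal sketch of looking up a blending function by name; the assumption here is that blend functions operate on a pair of normalized channel values:

```
(keys c/blends)                      ;; available blend names
(def multiply (c/blends :multiply))
(multiply 0.4 0.8)                   ;; blend two channel values (assumed to be normalized)
```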

## Palettes / gradients

### Links

List of all defined colors and palettes:

* [Named palettes](../static/palettes.html)
* [Colourlovers palettes](../static/colourlovers.html)
* [Gradients](../static/gradients.html)

### Palette

A palette is just a sequence of colors.

Plenty of them are predefined, and others can be generated:

* [[colourlovers-palettes]] contains the top 500 palettes from the [colourlovers](http://www.colourlovers.com/) website as a vector.
* [[palette-presets]] contains 256 Brewer, categorical, viridis, Tableau and Microsoft palettes as a map. See [[palette-presets-list]] for the names.
* [[paletton-palette]] generates a palette of type `:monochromatic`, `:triad` or `:tetrad`, with a complementary color, for a given hue and configuration. See also the [Paletton](http://paletton.com) website for details.
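
For example (the `:category10` key and the [[paletton-palette]] arity below are assumptions; check [[palette-presets-list]] and the function docs for the real names):

```
(first c/colourlovers-palettes)   ;; first of the 500 colourlovers palettes
(c/palette-presets :category10)   ;; assumed preset key
(c/paletton-palette :triad 210)   ;; assumed arity: palette type and hue in degrees
```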

### Gradient

A gradient is a continuous function which accepts a value from the `[0-1]` range and returns a color. Call [[gradient]], [[gradient-easing]] or [[iq-gradient]] to create one.

Predefined gradients are collected in the [[gradient-presets]] map. Among them you can find `cubehelix`-based gradients and gradients generated from [Inigo Quilez](http://iquilezles.org/www/articles/palettes/palettes.htm) settings.
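
For example, a palette of named colors turned into a gradient:

```
(def grad (c/gradient [:blue :white :red]))

(grad 0.0)                  ;; first color
(grad 0.5)                  ;; color halfway along the gradient
(keys c/gradient-presets)   ;; names of the predefined gradients
```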

### Conversions

To convert a palette to a gradient call the [[gradient]] function. You can set the interpolation method and color space.

To convert a gradient to a palette call the `sample` function from the fastmath library.

Call [[resample]] to resample a palette to a different number of colors. Internally the input palette is converted to a gradient and sampled back.

To make a gradient from two colors you can also use [[gradient-easing]], where you interpolate between two colors using one of the easing functions from `fastmath`.

A linear gradient between two colors is available as the [[lerp]] function.

## Distances

Several functions to calculate the distance between colors (`euclidean`, `delta-xxx`, etc.).

## Thi.ng interoperability

Thi.ng `RGBA` type implements [[ColorProto]]. To convert any color to `RGBA` use [[to-thing-rgba]].

# clojure2d.core

Main Clojure2d entry point for Canvas, Window and generative drawing.

Basic concepts:

* Image - `BufferedImage` Java object used to store ARGB color information.
* Canvas - an Image with an attached graphical context. You draw on it. Similar to Processing's Graphics object.
* Window - displays a canvas, processes events and supports the app/script workflow. Similar to a Processing sketch with a display.
* Events - mouse and keyboard events

Protocols:

* [[ImageProto]] - basic Image operations. Image, Canvas, Window and Pixels (see [[clojure2d.pixels]]) implement this protocol.
* Various events protocols. Events and Window implement these:
    * [[MouseXYProto]] - mouse position related to Window.
    * [[MouseButtonProto]] - status of mouse buttons.
    * [[KeyEventProto]] - keyboard status
    * [[ModifiersProto]] - status of special keys (Ctrl, Meta, Alt, etc.)
* Additionally, Window implements [[PressedProto]] in case you want to check in the draw loop whether a mouse button or key is pressed.

## Image

Image is a `BufferedImage` Java object. An Image can be read from a file using the [[load-image]] function or saved to a file with [[save]]. ImageProto provides the [[get-image]] function to access the Image object directly (if you need it).
There is no function which creates an Image directly (use Canvas instead).

### SVG

To load an SVG use `load-svg`, which creates an internal Batik object. The object can be rendered to an Image with `transcode-svg`.

## Canvas

Canvas is an object you draw on. To create a new one call the [[canvas]] function. Provide width and height and, optionally, a quality hint.

Quality hints are as follows:

* `:low` - no antialiasing, speed optimized rendering
* `:mid` - antialiasing, speed optimized rendering
* `:high` - antialiasing, quality optimized rendering (default)
* `:highest` - as `:high` plus `PURE_STROKE` hint, which can give strange results in some cases.

To draw on a Canvas you have to create a graphical context. Wrap your code in one of two macros:

* [[with-canvas]] - binding macro `(with-canvas [local-canvas canvas-object] ...)`
* [[with-canvas->]] - threading macro `(with-canvas-> canvas ...)`.

Each function inside these macros has to accept a Canvas as its first parameter and return a Canvas.

A Canvas bound to a Window and accessed via the callback drawing function (a.k.a. Processing's `draw()`) has its graphical context created automatically.
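
For example, a minimal sketch using the threading variant (drawing functions are described in the How to draw section below):

```
(require '[clojure2d.core :refer :all])

(def cnv (canvas 400 400 :high))   ;; width, height, quality hint

(with-canvas-> cnv
  (set-background :black)
  (set-color :white)
  (rect 50 50 300 300))
```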

## Events

Java2D keyboard and mouse event handlers can be defined as custom multimethods, separately for each window.
There are the following options:

* Handlers for a particular key with dispatch as a vector of window name and key character (e.g. `["name" \c]`)
    * [[key-pressed]] - when key is pressed
    * [[key-released]] - when key is released
    * [[key-typed]] - when key is typed
* Handler for a given key event with dispatch as a vector of window name and key event (e.g. `["name" :key-pressed]`)
* Handler for a mouse event with dispatch as a vector of window name and mouse event (e.g. `["name" :mouse-dragged]`)

Every event handler accepts as parameters:

* Event object (Java KeyEvent or MouseEvent) - access the fields through the defined protocols
* Global state - state attached to window

Event handlers should return the new global state.
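
For example, a sketch of handlers for a window named "demo"; the `mouse-event` multimethod name and the `mouse-x` / `mouse-y` accessors are assumptions based on the protocols listed above:

```
;; key handler, dispatched on window name and key character
(defmethod key-pressed ["demo" \space] [event state]
  (update state :paused? not))   ;; return the new global state

;; mouse handler, dispatched on window name and mouse event (assumed multimethod name)
(defmethod mouse-event ["demo" :mouse-pressed] [event state]
  (assoc state :last-click [(mouse-x event) (mouse-y event)]))
```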

## Window

Window object is responsible for displaying canvas content and processing events. You can also initialize states here.

To create a window and display it call [[show-window]]. The function accepts several parameters, which are described below.

The Window itself simulates the workflow available in the Processing/Quil frameworks.

### Internals

When a window is created, the following things are done:

1. Check parameters and fill in defaults for missing ones.
2. If a `:setup` function is provided, call it and use the returned value as `:draw-state`
3. Create a JFrame and a java.awt.Canvas, pack them, attach event handlers and display the window
4. Set the global state
5. Run a separate thread which refreshes the display at the given `:fps`; if `:draw-fn` is provided, it is called before each refresh.

Additional information:

* Display refresh is done by drawing the canvas onto the JFrame. You can set separate quality hints (same as for the canvas) for this process with the `:hint` parameter.
* When you provide a drawing function, it is called on every refresh. By default a graphical context is created on every call - which costs time but is safe in case you want to access pixels directly. The other variant is to create the graphical context once, at window creation time. This variant can be forced by setting the `:refresher` parameter to `:fast`.
* You can replace the canvas attached to a window with the [[replace-canvas]] function.
* The Window itself acts as an event object (implements all event protocols).
* Canvas and window can have different sizes. Display refreshing functions will scale up/down in such a case.
* Events and refreshing are not synchronized. Try to avoid drawing inside event multimethods.
* You can create as many windows as you want.
* You can check whether a window is visible with the [[window-active?]] function.
* If you provide both `:draw-state` and `:setup`, the value returned by `:setup` takes precedence unless it is `nil` or `false`.
 
### Parameters

The following parameters are used:

* `:canvas` - canvas displayed in the window. Default is 200x200 px.
* `:window-name` - name of the window as a string. Used for event multimethod dispatch.
* `:w` - width of the window. Defaults to the canvas width.
* `:h` - height of the window. Defaults to the canvas height.
* `:fps` - frames per second, defaults to 60.
* `:draw-fn` - drawing function, called before every display refresh. The function should accept the following four parameters:
    * canvas within a graphical context (you don't need to use the [[with-canvas]] or [[with-canvas->]] wrappers)
    * window object
    * current frame number as a long value
    * current state
* `:setup` - setup function which should accept two parameters and return the initial draw state:
    * canvas within a graphical context
    * window object
* `:state` - initial global state
* `:draw-state` - initial local (drawing) state. If `setup` is provided, the value returned by it will be used instead.
* `:hint` - display quality hint. Use it when the window and canvas have different sizes.
* `:refresher` - when to create the graphical context for drawing: `:fast` creates it once, `:safe` (default) creates it on each call.
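
Putting it together, a sketch of a window with a drawing callback (this assumes the map-style arity of [[show-window]]):

```
(defn draw
  "Called before every refresh; the return value becomes the next draw state."
  [canvas window frame state]
  (-> canvas
      (set-background :black)
      (set-color :white)
      (rect 100 100 (inc (mod frame 200)) 200))
  state)

(def window (show-window {:canvas (canvas 400 400)
                          :window-name "demo"
                          :fps 60
                          :draw-fn draw}))
```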

## States

There are two states managed by the library: a global state connected to the window and a draw state connected to the callback drawing function.

### Global state

Each window has its own state kept in an `atom`. The main idea is to have data which flows between event calls. Every event function accepts the state and should return state data.
The initial state can be set with the [[show-window]] `:state` parameter.
To access the current state from outside the flow call [[get-state]]. You can also mutate the state with [[set-state!]].
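
For example:

```
(set-state! window {:clicks 0})   ;; replace the global state
(get-state window)                ;; read it from outside the event flow
```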

### Local state for drawing function

When the drawing callback is used you can keep state between calls: whatever the callback returns is passed as a parameter in the next call. The drawing function is not synchronized with events, which is why this local state is introduced. You can still access and change the global state.

You can initialize this state from [[show-window]] with the `:draw-state` or `:setup` parameters.

## How to draw

There are plenty of functions which you can use to draw on a canvas. They can be grouped into:

* Primitives like [[point]], [[rect]], etc.
* Transformations like [[translate]], [[rotate]], etc.
* Text rendering like [[text]], [[set-font-attributes]], etc.
* Image manipulations like [[convolve]]
* Color and style like [[set-color]], [[gradient-mode]], [[set-background]], [[set-stroke]], etc.

All of them operate on a canvas and return the canvas as a result. Note that the canvas is mutated.

## Session

A session is a datetime together with its hash, kept globally in a vector. To access the current session name call [[session-name]].

Following functions rely on session:

* [[next-filename]] - generates a unique filename based on the session
* [[log]] - saves any information to a file named after the session. See [[log-name]].

Session is created automatically when needed. Session management functions are:

* [[make-session]] - create a new session
* [[ensure-session]] - create a new session if none exists
* [[close-session]] - close the current session
* [[session-name]] - returns the current session name.
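
For example, a small sketch (the [[next-filename]] and [[log]] arguments here are assumptions):

```
(session-name)                        ;; current session name
(next-filename "generated/" ".jpg")   ;; assumed arity: prefix and extension
(log "frame rendered")                ;; assumed: message written to a session-based log file
```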

## Utilities

Additional utility functions:

* date and time functions
* [[to-hex]] formatter

# clojure2d.extra.glitch

Various glitching pixel filters and functions.

### Filter

Use the following filters with the [[filter-channels]] function.

* Slitscan - x/y slitscan simulation based on wave functions
* Shift-channels - just shift channels
* Mirror - mirror the image along different axes
* Slitscan2 - slitscan simulation based on vector fields
* Fold - apply a vector field to the image
* Pix2line - convert pixels into horizontal lines

### Machines

Short sketches operating on images/pixels.

* Blend - compose two images in glitchy way

All filters come equipped with a random configuration generator.

# clojure2d.extra.segmentation

Segment image into parts.

Currently contains only quadtree segmentation.

# clojure2d.extra.signal

Signal processing and generation.

## Signal

A signal is an array of doubles in the `[-1.0, 1.0]` range, packed into the `Signal` type.

Signal can be:

* obtained from `Pixels` with the [[pixels->signal]] function.
* generated by calling [[signal-from-wave]].
* loaded from a file with the [[load-signal]] function. The signal should be encoded as 16-bit signed integers, big endian.

### Pixels as Signal

`Pixels` (`Image`) can be treated as a `Signal`. Conversion to a Signal is based on the strategies used when converting an image to RAW and then to audio: channel data layout, packing into integers, encoding, endianness, etc.

To convert `Pixels` to a `Signal` use the [[pixels->signal]] function. To convert back use [[signal->pixels]]. [[signal->pixels]] requires a target `Pixels` object to store the result of the conversion; the target is mutated.

To filter `Pixels` directly (without explicit conversion to and from Signals) you can use [[filter-channels]] with [[effects-filter]].
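
A sketch of direct filtering; the `load-pixels` name and the [[effect]] parameter-map arity are assumptions:

```
(require '[clojure2d.pixels :as p]
         '[clojure2d.extra.signal :as s])

(def pxls (p/load-pixels "input.jpg"))   ;; assumed loader name
(def lowpass (s/effect :simple-lowpass {:rate 44100.0 :cutoff 1000.0}))

;; run the effect over every channel without explicit Signal conversion
(def filtered (p/filter-channels (s/effects-filter lowpass) pxls))
```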

### Wave as Signal

You can create a [[wave]] function from an oscillator. You can also sum waves with [[sum-waves]].

To sample a wave into a signal, call [[wave->signal]] with the following parameters:

* `f` - wave function
* `samplerate` - sample rate (samples per second)
* `seconds` - how many seconds to generate
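
For example, two seconds of a 440 Hz sine sampled into a signal (the [[wave]] argument order is an assumption):

```
(def sine (s/wave :sin 440.0 1.0 0.0))   ;; assumed args: oscillator, frequency, amplitude, phase
(def sig  (s/wave->signal sine 44100 2)) ;; 2 seconds at 44100 samples per second
```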

### File operations

You can [[save-signal]] or [[load-signal]]. The representation is 16-bit signed, big endian. Use Audacity or SoX to convert to/from audio files.

## Signal processing

To process a Signal call the [[apply-effects]] function on it.

An effect is a signal filter, created with the [[effect]] multimethod. Effects can be composed with [[compose-effects]]. An effect can be treated as a function and called on a given sample.

Each effect has its own parametrization which should be passed during creation.

The list of all available effects is under the [[effects-list]] value.
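
For example, a sketch of creating, composing and applying effects (the [[apply-effects]] argument order is an assumption):

```
(def echo    (s/effect :echo {:delay 0.25 :decay 0.4 :rate 44100.0}))
(def lowpass (s/effect :simple-lowpass {:rate 44100.0 :cutoff 2000.0}))
(def fx      (s/compose-effects lowpass echo))

(def processed (s/apply-effects sig fx))   ;; `sig` as created in the earlier sketch
(fx 0.5)                                   ;; an effect can also be called on a single sample
```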

### Effects parametrization

Each effect has its own parametrization

#### :simple-lowpass, :simple-highpass

* `:rate` - sample rate (default 44100.0)
* `:cutoff` - cutoff frequency (default 2000.0)

#### :biquad-eq

Biquad equalizer

* `:fc` - center frequency
* `:gain` - gain
* `:bw` - bandwidth (default: 1.0)
* `:fs` - sampling rate (default: 44100.0)

#### :biquad-hs, :biquad-ls

Biquad highpass and lowpass shelf filters

* `:fc` - center frequency
* `:gain` - gain
* `:slope` - shelf slope (default 1.5)
* `:fs` - sampling rate (default 44100.0)

#### :biquad-lp, :biquad-hp, :biquad-bp

Biquad lowpass, highpass and bandpass filters

* `:fc` - cutoff/center frequency
* `:bw` - bandwidth (default 1.0)
* `:fs` - sampling rate (default 44100.0)

#### :dj-eq

* `:high` - high frequency gain (10000Hz)
* `:mid` - mid frequency gain (1000Hz)
* `:low` - low frequency gain (100Hz)
* `:shelf-slope` - shelf slope for high frequency (default 1.5)
* `:peak-bw` - peak bandwidth for mid and low frequencies (default 1.0)
* `:rate` - sampling rate (default 44100.0)

#### :phaser-allpass

* `:delay` - delay factor (default: 0.5)

#### :divider

* `:denom` (long, default 2.0)

#### :fm

Modulate and demodulate signal using frequency

* `:quant` - quantization value (0.0 - if no quantization, default 10)
* `:omega` - carrier factor (default 0.014)
* `:phase` - deviation factor (default 0.00822)

#### :bandwidth-limit

https://searchcode.com/file/18573523/cmt/src/lofi.cpp#

* `:rate` - sample rate (default 44100.0)
* `:freq` - cutoff frequency (default 1000.0)

#### :distort

* `:factor` - distortion factor (default 1.0)

#### :foverdrive

Fast overdrive

* `:drive` - drive (default 2.0)

#### :decimator

* `:bits` - bit depth (default 2)
* `:fs` - decimator sample rate (default 4410.0)
* `:rate` - input sample rate (default 44100.0)

#### :basstreble

* `:bass` - bass gain (default 1.0)
* `:treble` - treble gain (default 1.0)
* `:gain` - gain (default 0.0)
* `:rate` - sample rate (default 44100.0)
* `:slope` - slope for both (default 0.4)
* `:bass-freq` - bass freq (default 250.0)
* `:treble-freq` - treble freq (default 4000.0)

#### :echo

* `:delay` - delay time in seconds (default 0.5)
* `:decay` - decay (amount echo in signal, default 0.5)
* `:rate` - sample rate (default 44100.0)

_Warning! The echo filter uses a mutable array as internal state; don't use the same filter in parallel processing._

#### :vcf303

* `:rate` - sample rate (default 44100.0)
* `:trigger` - boolean, trigger some action (default `false`), set true when you reset filter every line
* `:cutoff` - cutoff frequency (values 0-1, default 0.8)
* `:resonance` - resonance (values 0-1, default 0.8)
* `:env-mod` - envelope modulation (values 0-1, default 0.5)
* `:decay` - decay (values 0-1, default 1.0)
* `:gain` - gain output signal (default: 1.0)

#### :slew-limit

http://git.drobilla.net/cgit.cgi/omins.lv2.git/tree/src/slew_limiter.c

* `:rate` - sample rate
* `:maxrise` - maximum change for rising signal (in terms of 1/rate steps, default 500)
* `:maxfall` - maximum change for falling signal (default 500)

#### :mda-thru-zero

* `:rate` - sample rate
* `:speed` - effect rate
* `:depth`
* `:mix`
* `:depth-mod`
* `:feedback`

_Warning: like `:echo`, internal state is kept in a doubles array._

# clojure2d.extra.utils

A set of utilities for displaying various objects.

# clojure2d.pixels

Operations on pixel levels.

## Content

Namespace defines three main concepts:

* Pixels - channel values packed into array.
* Processors - parallel Pixels processing functions (like filters).
* Bins - log density renderer

## Pixels

Pixels is a type which represents an image as an int array divided into color channels. The layout is linear and interleaved, which means the array is 1D and each pixel is represented by four consecutive values: R, G, B, A. The first row is followed by the second, and so on.

Pixels allows mutation: you can read and set channel values or colors:

* [[get-value]], [[set-value]] - read or set channel value for given pixel and channel. Value should be within `[0-255]` range.
* [[get-color]], [[set-color]] - read or set color for given pixel. Returned color has [[Vec4]] type.
* [[get-channel]], [[set-channel]] - read or set whole channel as `ints`.

Pixel access can be made by `(x,y)` coordinates or by index which is equivalent to `(+ x (* y width))`.

Pixels implement [[ImageProto]].

### Creation / conversions

To create empty Pixels, call [[pixels]].

You can also get and set Pixels from and to Images and Canvases or read from file.
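
For example (the [[get-value]] argument order below is an assumption):

```
(require '[clojure2d.pixels :as p]
         '[clojure2d.color :as c])

(def pxls (p/pixels 100 100))               ;; empty 100x100 Pixels
(p/set-color pxls 10 20 (c/color 255 0 0))  ;; set the pixel at (10,20)
(p/get-color pxls 10 20)                    ;; read it back as Vec4
(p/get-value pxls 1 10 20)                  ;; channel 1 (green) at (10,20), assumed arg order
```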

## Processors

The library supports several processing functions and helpers to manipulate channels or colors in parallel. None of the functions are destructive - a new object is created to store the result of the manipulation. Every processor accepts one or more filtering functions which do the job.
There are three main functions:

* [[filter-colors]] - to process colors. Uses a function `f` which accepts a color and should return a color. Can be used to convert Pixels between different color spaces.
* [[filter-channels]] - to process channel values. Uses a function `f` which accepts a channel number (values from 0 to 3), target Pixels and source Pixels, and returns an integer. You can provide a different function for every channel. Can be used to apply a filter (like blur).
* [[blend-channels]] - to process a pair of Pixels. Uses a function `f` which accepts a channel number, target and two Pixels values. The [[compose-channels]] wrapper can be used to compose two Pixels using one of the blending functions defined in the [[clojure2d.color]] namespace.

Additionally, other processing functions are prepared in case you want to write your own filters or converters:

* [[filter-colors-xy]] - process colors using a function `f` which accepts Pixels and the current position.
* [[filter-channel]] - iterate through a channel, `f` accepts a channel value and returns a new channel value
* [[filter-channel-xy]] - iterate through a channel, `f` accepts a channel, Pixels and the x,y position
* [[blend-channel]] and [[blend-channel-xy]] - similar to the two above, `f` accepts two Pixels instead of one.

### Color space

To convert whole Pixels into a different color space use [[filter-colors]] and pass one of the color space conversion functions defined under [[colorspaces*]]. Always use the normalized version.

```
(filter-colors c/to-HSB* pixels-object)
```

### Filters

There are several ready-to-use filters, all defined under the [[filters-list]] variable. Some of the filters are creators and should be called with a parametrization.

```
(filter-channels gaussian-blur-3 pixels-object)
(filter-channels (posterize 10) pixels-object)
```

### Composing

To compose two Pixels use [[compose-channels]] with the name of a composing function defined in [[blends-list]]. Instead of a name you can pass a custom composing function.

```
(compose-channels :multiply pixels-1 pixels-2)
```

## Log Density Rendering

The Log Density Renderer was originally created for fractal flame rendering and produces very smooth results. Details are described in this [paper](http://flam3.com/flame.pdf).

The renderer is point-based (there are no other primitives) and supports a selection of antialiasing (reconstruction) filters. Density estimation is not supported.

The rendering algorithm collects color channel values and counts the number of hits for each pixel. For each pixel a weighted average of all color values is calculated, and the log of the number of hits gives the alpha value. The pixel color is blended with the background using this alpha.

### Rendering

First you have to create a renderer with the [[renderer]] function. By default no filter is used.
In case you want to use a filter, call it with: the filter name as a keyword (see below) and, optionally, the filter radius (default 2.0) and other filter parameters.

To plot a point call [[set-color]].
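
For example, a sketch of the rendering loop (the [[renderer]] and [[to-pixels]] arities shown are assumptions):

```
(def rnd (p/renderer 800 800 :gaussian))    ;; assumed: width, height, filter name
(dotimes [_ 100000]
  (p/set-color rnd (rand 800) (rand 800) (c/color 255 200 100)))
(def result (p/to-pixels rnd {:background :black}))
```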

#### Antialiasing filters

Below you have list of all available antialiasing filters:

* :gaussian - two parameters, radius and alpha (default: 2.0)
* :box - one parameter, radius (use 0.5)
* :sinc - Lanczos filter, two parameters, radius and tau (default: 1.0)
* :mitchell - Mitchell-Netravali filter, three parameters, radius, B and C (default: 1/3)
* :cubic - one parameter, radius
* :catmull - one parameter, radius
* :triangle - one parameter, radius
* :cosinebell - two parameters, radius and xm (default: 0.5)
* :blackmann-harris - one parameter, radius

### Converting

To convert the renderer to Pixels call the [[to-pixels]] method with an optional configuration. The configuration lets you control the transformation to RGB data.

The configuration is a map with the following fields:

* :background - color of the background (default: :black)
* :gamma-alpha - gamma correction for alpha, to adjust blending strength.
* :gamma-color - gamma correction for color, to adjust intensity
* :intensity - mix between calculated and gamma-corrected color:
    * 1.0 - use calculated color
    * 0.0 - use gamma corrected color
    * (0.0-1.0) - mix between the above
* :saturation - adjust saturation (0-2)
* :brightness - adjust brightness (0-2)
* :contrast - adjust contrast (0-2)

### Parallel rendering

The renderer's construction enables parallel computing. Just create as many renderers as you want (you may use the [[available-tasks]] value), run rendering in separate threads, and then merge the results with [[merge-renderers]].
