
clj-djl.nn


add

(add net block)

batch-flatten

(batch-flatten array & more)

batch-flatten-block

(batch-flatten-block & more)

batchnorm-block

(batchnorm-block)
(batchnorm-block {:keys [axis center epsilon momentum scale]})

build

(build builder)

clear

(clear block)

cov2d-block

(cov2d-block {:keys [kernel-shape filters bias dilation groups padding stride]})

dropout

(dropout {:keys [rate]})

elu

(elu data alpha)

Applies ELU (Exponential Linear Unit) activation on the input NDArray or NDList

elu-block

(elu-block alpha)

Creates a LambdaBlock that applies the ELU activation function in its forward function:

ELU <- (if (> x 0) x (* alpha (- (pow e x) 1)))
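The ELU formula above can be checked with a scalar sketch in plain Clojure (no clj-djl required; `elu-scalar` is an illustrative name, not part of this namespace):

```clojure
;; Scalar sketch of the ELU formula quoted in the docstring above.
(defn elu-scalar [x alpha]
  (if (> x 0)
    x
    (* alpha (- (Math/exp x) 1))))

(elu-scalar 2.0 1.0)   ;; => 2.0 (positive inputs pass through unchanged)
(elu-scalar -1.0 1.0)  ;; negative inputs saturate toward (- alpha)
```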

forward

(forward block inputs)
(forward block paramstore inputs labels-or-training? & [params])

gelu

(gelu data)

Applies GELU (Gaussian Error Linear Unit) activation on the input NDArray or NDList

gelu-block

(gelu-block)

Creates a LambdaBlock that applies the GELU activation function in its forward function

get-parameters

(get-parameters block)

identity-block

(identity-block)

initialize

(initialize block manager datatype- & input-shapes)

leaky-relu

(leaky-relu data alpha)

leaky-relu-block

(leaky-relu-block alpha)

Creates a LambdaBlock with LeakyReLU as its forward function:

LeakyReLU = (if (>= x 0) x (* neg_slope x))
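The rule above reduces to one branch on the sign of the input; a scalar sketch in plain Clojure (no clj-djl required; `leaky-relu-scalar` is an illustrative name):

```clojure
;; Scalar sketch of the LeakyReLU rule quoted above.
(defn leaky-relu-scalar [x neg-slope]
  (if (>= x 0)
    x
    (* neg-slope x)))

(leaky-relu-scalar 3.0 0.1)   ;; => 3.0
(leaky-relu-scalar -2.0 0.1)  ;; => -0.2
```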

linear

(linear {:keys [bias units]})

linear-block

linear-builder

(linear-builder)

mish

(mish data)

Applies Mish activation on the input NDArray or NDList

mish-block

(mish-block)

Creates a LambdaBlock that applies the Mish activation function in its forward function

new-linear-builder

new-normal-initializer

normal-initializer

(normal-initializer)
(normal-initializer sigma)

opt-bias

(opt-bias builder bias)

prelu-block

(prelu-block)

Creates a LambdaBlock that applies the PReLU activation function in its forward function; the neg_slope is learned during training

relu

(relu data)

relu-block

(relu-block)

selu

(selu data)

Applies SELU (Scaled Exponential Linear Unit) activation on the input NDArray or NDList

selu-block

(selu-block)

Creates a LambdaBlock that applies the SELU activation function in its forward function:

SELU <- (* lambda (if (> x 0) x (* alpha (- (pow e x) 1)))), where lambda is
1.0507009873554804934193349852946 and alpha is 1.6732632423543772848170429916717
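The SELU formula is ELU scaled by the fixed constant lambda; a scalar sketch in plain Clojure using the constants quoted in the docstring (no clj-djl required; `selu-scalar` is an illustrative name):

```clojure
;; Scalar sketch of the SELU formula quoted above, with its fixed constants.
(def selu-lambda 1.0507009873554805)
(def selu-alpha  1.6732632423543772)

(defn selu-scalar [x]
  (* selu-lambda
     (if (> x 0)
       x
       (* selu-alpha (- (Math/exp x) 1)))))

(selu-scalar 1.0)  ;; positive inputs are simply scaled by lambda: ~1.0507
```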

sequential

(sequential)
(sequential {:keys [blocks initializer parameter]})
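As a sketch of how the builders on this page compose into a network, assuming clj-djl is on the classpath and this namespace is aliased as `nn` (the alias and layer sizes are illustrative; the argument shapes follow the signatures documented here):

```clojure
;; Hypothetical composition sketch: a small MLP built from sequential,
;; add, linear, and relu-block, using the signatures listed on this page.
(require '[clj-djl.nn :as nn])

(def net
  (-> (nn/sequential)
      (nn/add (nn/linear {:units 256}))  ;; hidden layer
      (nn/add (nn/relu-block))           ;; activation block
      (nn/add (nn/linear {:units 10})))) ;; output layer
```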

sequential-block

set-initializer

(set-initializer net initializer parameter)

set-units

(set-units builder unit)

sigmoid

(sigmoid data)

sigmoid-block

(sigmoid-block)

softplus

(softplus data)

softplus-block

(softplus-block)

swish

(swish data beta)

Applies Swish activation on the input NDArray or NDList

swish-block

(swish-block beta)

Creates a LambdaBlock that applies the Swish activation function in its forward function

tanh

(tanh data)

tanh-block

(tanh-block)
