(batchnorm-block)
(batchnorm-block {:keys [axis center epsilon momentum scale]})
(cov2d-block {:keys [kernel-shape filters bias dilation groups padding stride]})
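A construction sketch for these two blocks (assuming this is clj-djl's clj-djl.nn namespace; all option values below are illustrative, and cov2d-block is used exactly as the var is documented):

(require '[clj-djl.nn :as nn])

;; batch normalization over the channel axis, with the learnable
;; scale (gamma) and center (beta) parameters enabled
(def bn (nn/batchnorm-block {:axis 1
                             :epsilon 1e-5
                             :momentum 0.9
                             :center true
                             :scale true}))

;; 3x3 convolution with 32 filters, unit stride and padding
(def conv (nn/cov2d-block {:kernel-shape [3 3]
                           :filters 32
                           :stride [1 1]
                           :padding [1 1]
                           :bias true}))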
(elu data alpha)
Applies ELU (Exponential Linear Unit) activation on the input NDArray or NDList
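For example (assuming clj-djl's ndarray helpers new-base-manager and create, which are not part of this namespace):

(require '[clj-djl.ndarray :as nd])
(def ndm (nd/new-base-manager))
(def x (nd/create ndm [-2. -1. 0. 1. 2.]))
;; identity for positive entries, alpha * (e^x - 1) for negative ones
(nn/elu x 1.0)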
(elu-block alpha)
Creates a LambdaBlock that applies the ELU activation function in its forward function:
ELU <- (if (> x 0) x (* alpha (- (pow e x) 1)))
(forward block inputs)
(forward block paramstore inputs labels-or-training? & [params])
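A sketch tying a block to forward; here inputs (an NDList) and paramstore are hypothetical placeholders, since constructing them is outside this doc:

(def block (nn/elu-block 1.0))
;; two-arity form: the block plus its inputs
(nn/forward block inputs)
;; full form: parameter store, inputs, and the labels-or-training? flag
(nn/forward block paramstore inputs false)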
(gelu data)
Applies GELU (Gaussian Error Linear Unit) activation on the input NDArray or NDList
(gelu-block)
Creates a LambdaBlock that applies the GELU activation function in its forward function
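GELU takes no extra parameter, so both forms are one-liners (reusing the x from the elu sketch above):

(nn/gelu x)                   ;; apply directly to an NDArray
(def gelu-b (nn/gelu-block))  ;; or wrap in a block for use inside a network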
(leaky-relu-block alpha)
Creates a LambdaBlock with LeakyReLU as its forward function:
LeakyReLU <- (if (>= x 0) x (* neg_slope x))
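For instance, with a leak of 0.01 (a common default; the value here is illustrative):

;; negative inputs are scaled by neg_slope = 0.01 instead of being zeroed
(def leaky (nn/leaky-relu-block 0.01))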
(mish data)
Applies Mish activation on the input NDArray or NDList
(mish-block)
Creates a LambdaBlock that applies the Mish activation function in its forward function
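Mish is parameter-free as exposed here, so usage mirrors gelu (x as in the elu sketch):

(nn/mish x)                   ;; apply directly to an NDArray
(def mish-b (nn/mish-block))  ;; or as a block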
(prelu-block)
Creates a LambdaBlock that applies the PReLU (Parametric ReLU) activation function in its forward function; the neg_slope is learned during training
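Since the slope is a trainable parameter, nothing is passed at construction time:

;; neg_slope is a learnable parameter, updated during training
(def prelu (nn/prelu-block))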
(selu data)
Applies SELU (Scaled Exponential Linear Unit) activation on the input NDArray or NDList
(selu-block)
Creates a LambdaBlock that applies the SELU activation function in its forward function:
SELU <- (* lambda (if (> x 0) x (* alpha (- (pow e x) 1)))), where lambda is 1.0507009873554804934193349852946 and alpha is 1.6732632423543772848170429916717
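Both SELU constants are fixed, so neither form takes arguments (x as in the elu sketch):

(nn/selu x)                   ;; apply directly to an NDArray
(def selu-b (nn/selu-block))  ;; or as a block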
(swish data beta)
Applies Swish activation on the input NDArray or NDList
(swish-block beta)
Creates a LambdaBlock that applies the Swish activation function in its forward function
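Swish requires a beta; with beta = 1.0 it reduces to x * sigmoid(x), often called SiLU (the value here is illustrative):

(nn/swish x 1.0)
(def swish-b (nn/swish-block 1.0))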