
deebn.dnn


classify-obv

(classify-obv dnn obv)

Given a DNN and a single observation, return the model's prediction.
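
A minimal usage sketch; `trained-dnn` and `obv` are hypothetical placeholder names, not part of the library:

```clojure
;; Hypothetical usage: `trained-dnn` is a DNN produced by train-dnn, and
;; `obv` is a single observation vector without its label.
(classify-obv trained-dnn obv)
```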

dbn->dnn

(dbn->dnn dbn classes)

Given a pretrained Deep Belief Network, use the trained weights and biases to build a Deep Neural Network.
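
A hedged usage sketch; `pretrained-dbn` stands for a Deep Belief Network pretrained elsewhere in the library, and 10 is an example class count:

```clojure
;; Convert a pretrained DBN into a DNN with 10 output classes.
(def dnn (dbn->dnn pretrained-dbn 10))
```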

edn->DNN

(edn->DNN data)

The default map->DNN function generated by the defrecord doesn't give us a performant implementation (i.e. matrices and arrays from core.matrix), so this function adds a small step to ensure that.

feed-forward

(feed-forward batch dnn)

Given an initial input batch and a DNN, feed the batch through the net, retaining the output of each layer.

layer-error

(layer-error weights next-error output)

Calculate the error for a particular layer in a net, given the weights for the next layer, the error for the next layer, and the output for the current layer.
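
This corresponds to the usual backpropagation step. A self-contained sketch in plain Clojure, assuming sigmoid units (so the derivative term is `output * (1 - output)` -- that activation is an assumption, and the real function operates on core.matrix structures):

```clojure
;; Sketch: error_l = (W_{l+1}^T . error_{l+1}) elementwise-times output_l * (1 - output_l)
;; `weights` is a seq of per-next-layer-unit weight vectors over the current layer.
(defn layer-error-sketch [weights next-error output]
  (let [back (apply mapv (fn [& col] (reduce + (map * next-error col)))
                    weights)]
    (mapv (fn [b o] (* b o (- 1.0 o))) back output)))
```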

load-dnn

(load-dnn filepath)

Load a DNN from disk.

net-output

(net-output net input)

Propagate an input matrix through the network.

prop-up

(prop-up input weights bias)

Given an input matrix, weight matrix, and bias vector, propagate the signal through the layer.
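
In plain Clojure the computation is roughly the following. Sigmoid is assumed as the activation here, and the library actually works on core.matrix matrices, so this is only an illustrative sketch:

```clojure
(defn sigmoid [x] (/ 1.0 (+ 1.0 (Math/exp (- x)))))

;; `weights` is a seq of per-unit weight vectors; `bias` a seq of per-unit biases.
;; For each unit j: out_j = sigmoid(bias_j + sum_i input_i * w_ji)
(defn prop-up-sketch [input weights bias]
  (mapv (fn [w b] (sigmoid (+ b (reduce + (map * input w)))))
        weights bias))
```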

save-dnn

(save-dnn dnn filepath)

Save a DNN to disk.
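
A hypothetical persistence round trip with load-dnn; the file path and `trained-dnn` are placeholders, not from the library's docs:

```clojure
;; Save a trained net and restore it later.
(save-dnn trained-dnn "models/my-dnn.edn")
(def restored-dnn (load-dnn "models/my-dnn.edn"))
```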

softmax->class

(softmax->class x)

Get the predicted class from a softmax output.
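
The idea reduces to an argmax over the softmax activations. A self-contained sketch of that idea on a plain Clojure vector (the real function works on core.matrix output):

```clojure
;; Predicted class = index of the largest softmax activation.
(defn argmax [xs]
  (first (apply max-key second (map-indexed vector xs))))

(argmax [0.1 0.7 0.2]) ;; => 1
```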

test-dnn

(test-dnn dnn dataset)

Test a Deep Neural Network on a dataset. Returns an error percentage.

dataset should have the label as the last entry in each observation.
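
A hedged usage sketch; `trained-dnn` and `test-set` are placeholder names:

```clojure
;; Rows of test-set end with the label; the return value is an
;; error percentage, so lower is better.
(test-dnn trained-dnn test-set)
```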

train-batch

(train-batch batch dnn observations learning-rate lambda)

Given a batch of training data and a DNN, update the weights and biases accordingly.

train-dnn

(train-dnn dnn dataset params)

Given a labeled dataset, train a DNN.

The dataset should have the label as the last element of each input vector.

params is a map that may have the following keys:
batch-size: default 100
epochs: default 100
learning-rate: default 0.5
lambda: default 0.1
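
A hedged usage sketch of the params map with the documented keys; `dnn` and `training-set` are placeholder names:

```clojure
;; Any key may be omitted, in which case the documented default applies.
(def trained-dnn
  (train-dnn dnn training-set {:batch-size 100
                               :epochs 100
                               :learning-rate 0.5
                               :lambda 0.1}))
```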

train-epoch

(train-epoch net dataset observations learning-rate lambda batch-size)

Given a training dataset and a net, train it for one epoch (one pass over the dataset).

train-top-layer

(train-top-layer dnn
                 dataset
                 observations
                 batch-size
                 epochs
                 learning-rate
                 lambda)

Pre-train the top logistic regression layer before moving to fine-tuning.

update-layer

(update-layer weights
              biases
              input
              error
              learning-rate
              lambda
              batch-size
              observations)

Update the weights and biases of a layer, given the previous weights and biases, the input coming into the weights, the error for the layer, the learning rate, lambda, the batch size, and the total number of observations.
