(->ada-delta-config {:keys [rho epsilon]
                     :or {rho AdaDelta/DEFAULT_ADADELTA_RHO
                          epsilon AdaDelta/DEFAULT_ADADELTA_EPSILON}})
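Example call (values illustrative, not recommendations); any key omitted from the map falls back to the AdaDelta default constant via the :or destructuring above:

(->ada-delta-config {:rho 0.95 :epsilon 1e-6})
(->ada-delta-config {}) ; all defaults
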
(->ada-grad-config {:keys [learning-rate epsilon]
                    :or {learning-rate AdaGrad/DEFAULT_ADAGRAD_LEARNING_RATE
                         epsilon AdaGrad/DEFAULT_ADAGRAD_EPSILON}})
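For example (illustrative value; epsilon falls back to its AdaGrad default):

(->ada-grad-config {:learning-rate 0.01})
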
(->ada-max-config {:keys [learning-rate beta1 beta2 epsilon]
                   :or {learning-rate AdaMax/DEFAULT_ADAMAX_LEARNING_RATE
                        beta1 AdaMax/DEFAULT_ADAMAX_BETA1_MEAN_DECAY
                        beta2 AdaMax/DEFAULT_ADAMAX_BETA2_VAR_DECAY
                        epsilon AdaMax/DEFAULT_ADAMAX_EPSILON}})
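For example (illustrative values; omitted keys take the AdaMax defaults):

(->ada-max-config {:learning-rate 0.002 :beta1 0.9 :beta2 0.999})
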
(->adam-config {:keys [learning-rate beta1 beta2 epsilon]
                :or {learning-rate Adam/DEFAULT_ADAM_LEARNING_RATE
                     beta1 Adam/DEFAULT_ADAM_BETA1_MEAN_DECAY
                     beta2 Adam/DEFAULT_ADAM_BETA2_VAR_DECAY
                     epsilon Adam/DEFAULT_ADAM_EPSILON}})
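For example (illustrative values; omitted keys take the Adam defaults):

(->adam-config {:learning-rate 0.001 :beta1 0.9 :beta2 0.999 :epsilon 1e-8})
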
(->ams-grad-config {:keys [learning-rate beta1 beta2 epsilon]
                    :or {learning-rate AMSGrad/DEFAULT_AMSGRAD_LEARNING_RATE
                         beta1 AMSGrad/DEFAULT_AMSGRAD_BETA1_MEAN_DECAY
                         beta2 AMSGrad/DEFAULT_AMSGRAD_BETA2_VAR_DECAY
                         epsilon AMSGrad/DEFAULT_AMSGRAD_EPSILON}})
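For example (illustrative value; the beta and epsilon keys fall back to the AMSGrad defaults):

(->ams-grad-config {:learning-rate 0.001})
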
(->gradient-updater-config {:keys [type] :as options})
(->gradient-updater-config kind options)

Builds a gradient updater configuration (IUpdater) from an updater
kind and updater-specific arguments. It takes two arguments, kind
and options, which may instead be supplied nested in a single map.
Throws an exception if the requested updater does not exist.
See the underlying updater documentation for details.
Input :
- kind : the gradient updater kind, as a keyword
- options : a map of gradient-updater-specific configuration
Usage :
(->gradient-updater-config :ada-delta {:rho 1.256})
~
(->gradient-updater-config {:kind :ada-delta :options {:rho 1.256}})
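The same two-argument pattern should work for any of the updater kinds defined here; for example (value illustrative):

(->gradient-updater-config :adam {:learning-rate 0.001})
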
(->nadam-config {:keys [learning-rate beta1 beta2 epsilon]
                 :or {learning-rate Nadam/DEFAULT_NADAM_LEARNING_RATE
                      beta1 Nadam/DEFAULT_NADAM_BETA1_MEAN_DECAY
                      beta2 Nadam/DEFAULT_NADAM_BETA2_VAR_DECAY
                      epsilon Nadam/DEFAULT_NADAM_EPSILON}})
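For example (illustrative values; omitted keys take the Nadam defaults):

(->nadam-config {:learning-rate 0.001 :beta1 0.9})
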
(->nesterovs-config {:keys [learning-rate momentum]
                     :or {learning-rate Nesterovs/DEFAULT_NESTEROV_LEARNING_RATE
                          momentum Nesterovs/DEFAULT_NESTEROV_MOMENTUM}})
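For example (illustrative values):

(->nesterovs-config {:learning-rate 0.1 :momentum 0.9})
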
(->rms-prop-config {:keys [learning-rate rms-decay epsilon]
                    :or {learning-rate RmsProp/DEFAULT_RMSPROP_LEARNING_RATE
                         rms-decay RmsProp/DEFAULT_RMSPROP_RMSDECAY
                         epsilon RmsProp/DEFAULT_RMSPROP_EPSILON}})
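For example (illustrative values; any omitted key takes the RmsProp default):

(->rms-prop-config {:learning-rate 0.001 :rms-decay 0.95})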