This document lists the recurrent neural network API in Gluon:

.. currentmodule:: mxnet.gluon.rnn

Recurrent layers can be used in ``Sequential``
with other regular neural network layers.
For example, to construct a sequence labeling model where a prediction is made for each
time-step::

    model = mx.gluon.nn.Sequential()
    with model.name_scope():
        model.add(mx.gluon.nn.Embedding(30, 10))
        model.add(mx.gluon.rnn.LSTM(20))
        model.add(mx.gluon.nn.Dense(5, flatten=False))
    model.initialize()
    model(mx.nd.ones((2, 3)))

.. autosummary::
    :nosignatures:

    RNN
    LSTM
    GRU
Recurrent cells allow fine-grained control when defining recurrent models. Users
can explicitly step and unroll cells to construct complex networks. This provides
more flexibility but is slower than using recurrent layers. Recurrent cells can be
stacked with ``SequentialRNNCell``::
    model = mx.gluon.rnn.SequentialRNNCell()
    with model.name_scope():
        model.add(mx.gluon.rnn.LSTMCell(20))
        model.add(mx.gluon.rnn.LSTMCell(20))
    states = model.begin_state(batch_size=32)
    inputs = mx.nd.random.uniform(shape=(5, 32, 10))
    outputs = []
    for i in range(5):
        output, states = model(inputs[i], states)
        outputs.append(output)
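The loop above is the essence of explicit stepping: call the cell once per
time-step, threading the state through. As a framework-agnostic sketch of the
same pattern (plain Python with scalar weights; none of these names belong to
the Gluon API):

```python
import math

def rnn_cell_step(x, h, w_xh, w_hh):
    """One step of a minimal tanh RNN cell (scalar weights for brevity).

    x is the input at this time-step, h the previous hidden state.
    """
    return math.tanh(w_xh * x + w_hh * h)

def unroll(inputs, h0, w_xh=0.5, w_hh=0.1):
    """Explicitly step the cell over a sequence, collecting outputs."""
    h = h0
    outputs = []
    for x in inputs:  # one call per time-step, as in the loop above
        h = rnn_cell_step(x, h, w_xh, w_hh)
        outputs.append(h)
    return outputs, h

outputs, final_state = unroll([1.0, 0.0, -1.0], h0=0.0)
print(len(outputs))  # one output per time-step -> 3
```

Gluon's cells follow the same contract at tensor scale: each call returns the
output for one step plus the updated state, which is fed into the next call.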

.. autosummary::
    :nosignatures:

    RNNCell
    LSTMCell
    GRUCell
    RecurrentCell
    SequentialRNNCell
    BidirectionalCell
    DropoutCell
    ZoneoutCell
    ResidualCell

.. automodule:: mxnet.gluon.rnn
    :members:
    :imported-members:
Contributors: Sheng Zha, Aaron Markham & Eric Junyuan Xie