Rotational Unit of Memory: A Novel Representation Unit for RNNs with Scalable Applications
Stacking long short-term memory (LSTM) cells or gated recurrent units (GRUs) as part of a recurrent neural network (RNN) has become a standard approach to solving a number of tasks ranging from language modeling to text summarization. Although LSTMs and GRUs were designed...
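As a point of reference for the stacking setup the abstract mentions, the sketch below shows how several LSTM or GRU layers are typically composed into one recurrent encoder (e.g. for language modeling). This is not the paper's Rotational Unit of Memory implementation; the class name, hyperparameters, and PyTorch framework choice are illustrative assumptions.

```python
import torch
import torch.nn as nn

class StackedRecurrentEncoder(nn.Module):
    """Minimal sketch of a stacked LSTM/GRU encoder (hypothetical, not the paper's code)."""

    def __init__(self, vocab_size=10000, embed_dim=256, hidden_dim=512,
                 num_layers=2, cell="lstm"):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        rnn_cls = nn.LSTM if cell == "lstm" else nn.GRU
        # num_layers > 1 stacks recurrent cells on top of each other.
        self.rnn = rnn_cls(embed_dim, hidden_dim, num_layers=num_layers,
                           batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)  # per-step vocabulary logits

    def forward(self, token_ids):
        x = self.embed(token_ids)     # (batch, seq_len, embed_dim)
        outputs, _ = self.rnn(x)      # (batch, seq_len, hidden_dim); works for LSTM and GRU
        return self.proj(outputs)

# Usage example: a batch of 4 token sequences of length 20.
logits = StackedRecurrentEncoder()(torch.randint(0, 10000, (4, 20)))
```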
Published: 2019