Rotational Unit of Memory: A Novel Representation Unit for RNNs with Scalable Applications

Stacking long short-term memory (LSTM) cells or gated recurrent units (GRUs) as part of a recurrent neural network (RNN) has become a standard approach to solving a number of tasks ranging from language modeling to text summarization. Although LSTMs and GRUs were designed...
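The stacking the abstract refers to can be illustrated with a minimal sketch: several recurrent layers where each layer's hidden-state sequence becomes the input sequence of the layer above. The sketch below uses plain tanh recurrent cells in NumPy purely for illustration; the cell type, parameter shapes, and function names are assumptions, not the paper's RUM architecture or any specific library API.

```python
import numpy as np

def step(x, h, Wx, Wh, b):
    # One vanilla recurrent cell update: h' = tanh(Wx x + Wh h + b).
    # (Illustrative only; an LSTM/GRU cell would add gating here.)
    return np.tanh(Wx @ x + Wh @ h + b)

def stacked_rnn(xs, params):
    # `params` is a list of (Wx, Wh, b) tuples, one per layer;
    # each layer's output sequence feeds the next layer, which is
    # what "stacking" recurrent cells means.
    states = [np.zeros(Wh.shape[0]) for _, Wh, _ in params]
    outputs = []
    for x in xs:
        inp = x
        for i, (Wx, Wh, b) in enumerate(params):
            states[i] = step(inp, states[i], Wx, Wh, b)
            inp = states[i]  # hidden state becomes next layer's input
        outputs.append(inp)
    return np.stack(outputs)

rng = np.random.default_rng(0)
d_in, d_hid, T = 4, 8, 5
params = [
    (rng.normal(size=(d_hid, d_in)), rng.normal(size=(d_hid, d_hid)), np.zeros(d_hid)),
    (rng.normal(size=(d_hid, d_hid)), rng.normal(size=(d_hid, d_hid)), np.zeros(d_hid)),
]
xs = rng.normal(size=(T, d_in))
out = stacked_rnn(xs, params)
print(out.shape)  # (5, 8): one top-layer hidden state per time step
```

Replacing `step` with an LSTM or GRU update (or the paper's rotational unit) leaves the stacking loop unchanged, which is why stacked architectures are easy to experiment with.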


Bibliographic Details
Main Author: Rumen Dangovski
Other Authors: Li Jing, Preslav Nakov, Mićo Tatalović, Marin Soljačić
Published: 2019