Oleg Zabluda's blog
Monday, September 19, 2016
 
A Critical Review of Recurrent Neural Networks for Sequence Learning (2015) Zachary C. Lipton, John Berkowitz, Charles Elkan
"""
Countless learning tasks require dealing with sequential data. Image captioning, speech synthesis, and music generation all require that a model produce outputs that are sequences. In other domains, such as time series prediction, video analysis, and musical information retrieval, a model must learn from inputs that are sequences. Interactive tasks, such as translating natural language, engaging in dialogue, and controlling a robot, often demand both capabilities. Recurrent neural networks (RNNs) are connectionist models that capture the dynamics of sequences via cycles in the network of nodes. Unlike standard feedforward neural networks, recurrent networks retain a state that can represent information from an arbitrarily long context window. [...] When appropriate, we reconcile conflicting notation and nomenclature. Our goal is to provide a self-contained explication of the state of the art together with a historical perspective and references to primary research.
"""
http://arxiv.org/abs/1506.00019
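The abstract's core claim is that, unlike feedforward networks, an RNN retains a hidden state that carries context forward across time steps. A minimal sketch of that idea, using a vanilla (Elman-style) RNN cell with hypothetical names and randomly initialized weights (not the paper's notation or any particular library's API):

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # One step of a vanilla RNN: the new hidden state mixes the current
    # input with the previous state via the recurrent weights W_hh.
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 4  # illustrative sizes
W_xh = rng.normal(scale=0.1, size=(input_dim, hidden_dim))
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

# Unroll over a sequence: because h feeds back into each step, the final
# state depends on every earlier input -- the "cycle" in the network.
h = np.zeros(hidden_dim)
sequence = rng.normal(size=(5, input_dim))
for x_t in sequence:
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)
```

In principle the state can encode information from an arbitrarily long context window, though as the paper discusses, training such networks in practice runs into vanishing and exploding gradients, which motivates architectures like the LSTM.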
