Recurrent Neural Networks
Adversarial Vulnerabilities of Human Decision-making
November 2020
BlackOut: Speeding up Recurrent Neural Network Language Models With Very Large Vocabularies
Shihao Ji, S. V. N. Vishwanathan, Nadathur Satish, Michael J. Anderson, Pradeep Dubey
March 2016
Regularizing RNNs by Stabilizing Activations
David Krueger, Roland Memisevic
April 2016
Tuning Recurrent Neural Networks with Reinforcement Learning
Natasha Jaques, Shixiang Gu, Richard E. Turner, Douglas Eck
November 2016
On Multiplicative Integration with Recurrent Neural Networks
Yuhuai Wu, Saizheng Zhang, Ying Zhang, Yoshua Bengio, Ruslan Salakhutdinov
June 2016
Path-Normalized Optimization of Recurrent Neural Networks with ReLU Activations
Behnam Neyshabur, Yuhuai Wu, Ruslan Salakhutdinov, Nathan Srebro
May 2016
Abstract:
We investigate the parameter-space geometry of recurrent neural networks (RNNs), and develop an adaptation of the path-SGD optimization method, attuned to this geometry, that can learn plain RNNs with ReLU activations. On several datasets that require capturing long-term dependency structure, we show that path-SGD can significantly improve the trainability of ReLU RNNs compared to RNNs trained with SGD, even with various recently suggested initialization schemes.
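Since only the abstract is listed here, the sketch below is illustrative rather than taken from the paper: it sets up the model class the abstract refers to, a plain (vanilla) RNN with ReLU activations, trained with ordinary SGD as the baseline the authors compare against. The path-SGD method itself (rescaling each update by path-norm terms) is not reproduced; the dimensions, toy data, and the ReLURNNClassifier name are assumptions for illustration.

```python
# Minimal sketch of a plain ReLU RNN trained with the SGD baseline the abstract
# mentions. The paper's path-SGD would replace the optimizer step with updates
# rescaled per parameter by path-norm terms; that rescaling is not shown here.
import torch
import torch.nn as nn

class ReLURNNClassifier(nn.Module):
    def __init__(self, input_size, hidden_size, num_classes):
        super().__init__()
        # nonlinearity='relu' gives a plain RNN cell with ReLU activations
        # (no gating, no LSTM/GRU machinery).
        self.rnn = nn.RNN(input_size, hidden_size,
                          nonlinearity='relu', batch_first=True)
        self.readout = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        out, _ = self.rnn(x)              # out: (batch, seq_len, hidden)
        return self.readout(out[:, -1])   # classify from the final hidden state

model = ReLURNNClassifier(input_size=1, hidden_size=64, num_classes=10)
opt = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Toy long-sequence batch, standing in for tasks with long-term dependencies.
x = torch.randn(32, 200, 1)
y = torch.randint(0, 10, (32,))

for step in range(100):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
```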