Oleg Zabluda's blog
Thursday, April 20, 2017
 
Microsoft Wins ImageNet 2015 through Highway Net (or Feedforward LSTM) without Gates | Jürgen Schmidhuber
"""
Microsoft Research dominated the ImageNet 2015 contest with a very deep neural network of 150 layers [1]. Congrats to Kaiming He & Xiangyu Zhang & Shaoqing Ren & Jian Sun on the great results [2]!

Their Residual Net or ResNet [1] of December 2015 is a special case of our Highway Networks [4] of May 2015, the first very deep feedforward networks with hundreds of layers. Highway nets are essentially feedforward versions of recurrent Long Short-Term Memory (LSTM) networks [3] with forget gates (or gated recurrent units) [5].

Let g, t, h denote non-linear differentiable functions. Each non-input layer of a Highway Net computes g(x)x + t(x)h(x), where x is the data from the previous layer. (Like LSTM [3] with forget gates [5] for recurrent networks.)

The CNN layers of ResNets [1] do the same with g(x)=1 (a typical Highway Net initialisation) and t(x)=1, essentially like a Highway Net or a feedforward LSTM [3] without gates.
"""
http://people.idsia.ch/~juergen/microsoft-wins-imagenet-through-feedforward-LSTM-without-gates.html
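The layer computation quoted above can be sketched in a few lines of numpy. This is a minimal illustration, not the authors' implementation: the weight names (W_h, W_t, W_g) and the choice of tanh for h(x) and sigmoids for the gates are assumptions for the sketch, mirroring the post's y = g(x)·x + t(x)·h(x) form and its ResNet special case g(x) = 1, t(x) = 1.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def highway_layer(x, W_h, b_h, W_t, b_t, W_g, b_g):
    """One Highway layer: y = g(x)*x + t(x)*h(x), as in the post.

    tanh for h(x) and sigmoids for the gates are illustrative choices;
    the weight/bias names here are hypothetical, not from the papers.
    """
    h = np.tanh(W_h @ x + b_h)      # h(x): candidate transformation
    t = sigmoid(W_t @ x + b_t)      # t(x): "transform" gate
    g = sigmoid(W_g @ x + b_g)      # g(x): "carry" gate
    return g * x + t * h

def residual_layer(x, W_h, b_h):
    """ResNet special case: g(x) = 1 and t(x) = 1, so y = x + h(x)."""
    return x + np.tanh(W_h @ x + b_h)
```

Saturating both gates (e.g. with large positive gate biases so the sigmoids output 1) makes the highway layer coincide with the residual layer, which is the reduction the post describes.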
