Chapter 15. Processing Sequences Using RNNs and CNNs
This is a short companion page for our internal reading group on the book “Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow, 2nd Edition”. However, I have unashamedly used a lot of PyTorch examples.

Understanding LSTM

https://atcold.github.io/pytorch-Deep-Learning/en/week06/06-3/
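
For reference, these are the standard LSTM update equations that the linked lecture works through, where $x_t$ is the input, $h_t$ the short-term (hidden) state, $c_t$ the long-term (cell) state, $\sigma$ the logistic sigmoid, and $\odot$ the element-wise product:

$$
\begin{aligned}
i_t &= \sigma(W_{xi} x_t + W_{hi} h_{t-1} + b_i) \\
f_t &= \sigma(W_{xf} x_t + W_{hf} h_{t-1} + b_f) \\
o_t &= \sigma(W_{xo} x_t + W_{ho} h_{t-1} + b_o) \\
g_t &= \tanh(W_{xg} x_t + W_{hg} h_{t-1} + b_g) \\
c_t &= f_t \odot c_{t-1} + i_t \odot g_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
$$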

Input and Output

https://karpathy.github.io/2015/05/21/rnn-effectiveness/
  • one to many: an image -> a caption sentence
  • many to one: a sentence -> a sentiment (positive / negative) label, as sketched in PyTorch below
  • many to many: a sentence in English -> a sentence in Turkish
  • the other (synced) many to many: the frames of a video -> the coordinates of a bounding box around an object in each frame
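
A minimal PyTorch sketch of the many-to-one case (all names and sizes below are made up): run the whole sequence through the RNN, keep only the final hidden state, and map it to a single label.

```python
import torch
import torch.nn as nn

# Many-to-one: a batch of sequences -> one sentiment score each (sizes are arbitrary).
batch_size, seq_len, n_features, n_hidden = 4, 20, 32, 64

rnn = nn.RNN(input_size=n_features, hidden_size=n_hidden, batch_first=True)
head = nn.Linear(n_hidden, 1)        # maps the final hidden state to a single score

x = torch.randn(batch_size, seq_len, n_features)   # (batch, time, features)
outputs, h_n = rnn(x)                # outputs: (batch, time, hidden); h_n: (1, batch, hidden)

score = head(h_n[-1])                # many to one: keep only the last hidden state
print(outputs.shape, h_n.shape, score.shape)
# torch.Size([4, 20, 64]) torch.Size([1, 4, 64]) torch.Size([4, 1])
```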

NMT and Seq2Seq
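
A hedged sketch of the encoder-decoder (seq2seq) idea in PyTorch. The vocabulary sizes, the SOS token id, the maximum output length, and the greedy decoding loop are all illustrative assumptions, not the book's exact model: the encoder compresses the source sentence into its final (h, c) state, and the decoder generates the target sentence from that state one token at a time.

```python
import torch
import torch.nn as nn

SRC_VOCAB, TGT_VOCAB, EMB, HID = 1000, 1200, 64, 128   # placeholder sizes
SOS, MAX_LEN = 1, 30   # assumed start-of-sequence token id and output length cap

class Seq2Seq(nn.Module):
    def __init__(self):
        super().__init__()
        self.src_emb = nn.Embedding(SRC_VOCAB, EMB)
        self.tgt_emb = nn.Embedding(TGT_VOCAB, EMB)
        self.encoder = nn.LSTM(EMB, HID, batch_first=True)
        self.decoder = nn.LSTM(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, TGT_VOCAB)

    def forward(self, src):                          # src: (batch, src_len) token ids
        _, state = self.encoder(self.src_emb(src))   # keep only the final (h, c) "thought vector"
        tok = torch.full((src.size(0), 1), SOS, dtype=torch.long)
        outputs = []
        for _ in range(MAX_LEN):                     # greedy decoding, one target token at a time
            dec_out, state = self.decoder(self.tgt_emb(tok), state)
            logits = self.out(dec_out[:, -1])        # (batch, TGT_VOCAB)
            tok = logits.argmax(dim=-1, keepdim=True)
            outputs.append(tok)
        return torch.cat(outputs, dim=1)             # (batch, MAX_LEN) predicted token ids

model = Seq2Seq()
src = torch.randint(0, SRC_VOCAB, (2, 15))           # two fake source sentences of length 15
print(model(src).shape)                              # torch.Size([2, 30])
```

During training one would normally feed the ground-truth target tokens to the decoder (teacher forcing) rather than its own predictions; the loop above only shows inference.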

LSTM Cell
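
A minimal sketch (arbitrary sizes) of unrolling a single cell by hand: nn.LSTMCell applies the gate equations for one time step, so the loop over time is written explicitly.

```python
import torch
import torch.nn as nn

batch_size, seq_len, n_in, n_hidden = 4, 10, 8, 16

cell = nn.LSTMCell(input_size=n_in, hidden_size=n_hidden)
x = torch.randn(seq_len, batch_size, n_in)   # one time step per row

h = torch.zeros(batch_size, n_hidden)        # short-term (hidden) state
c = torch.zeros(batch_size, n_hidden)        # long-term (cell) state
for t in range(seq_len):
    h, c = cell(x[t], (h, c))                # gates decide what to forget, write, and output

print(h.shape, c.shape)                      # torch.Size([4, 16]) torch.Size([4, 16])
```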

LSTM
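
nn.LSTM runs the same recurrence over a whole sequence in one call and returns both the per-step outputs and the final (h, c) pair; a small sketch with made-up sizes:

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

x = torch.randn(4, 10, 8)            # (batch, time, features)
outputs, (h_n, c_n) = lstm(x)

print(outputs.shape)                 # torch.Size([4, 10, 16]) - h_t for every time step
print(h_n.shape, c_n.shape)          # torch.Size([1, 4, 16]) each - the final states only
```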

Stacked LSTM
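
Stacking is just num_layers > 1 in PyTorch, with optional dropout applied between layers; the sizes below are placeholders:

```python
import torch
import torch.nn as nn

stacked = nn.LSTM(input_size=8, hidden_size=16, num_layers=3,
                  dropout=0.2, batch_first=True)

x = torch.randn(4, 10, 8)
outputs, (h_n, c_n) = stacked(x)

print(outputs.shape)   # torch.Size([4, 10, 16]) - outputs of the top layer only
print(h_n.shape)       # torch.Size([3, 4, 16])  - one final hidden state per layer
```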

WaveNet
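
A hedged sketch of the core WaveNet idea only: a stack of causal 1D convolutions whose dilation rate doubles at each layer, so the receptive field grows exponentially with depth. The gated activations and skip connections of the real architecture are omitted, and all sizes are made up.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConvStack(nn.Module):
    def __init__(self, channels=32, kernel_size=2, n_layers=6):
        super().__init__()
        self.layers = nn.ModuleList()
        self.pads = []
        for i in range(n_layers):
            dilation = 2 ** i                      # 1, 2, 4, 8, ...
            self.pads.append((kernel_size - 1) * dilation)
            self.layers.append(nn.Conv1d(channels if i else 1, channels,
                                         kernel_size, dilation=dilation))
        self.head = nn.Conv1d(channels, 1, kernel_size=1)

    def forward(self, x):                          # x: (batch, 1, time)
        for pad, conv in zip(self.pads, self.layers):
            x = F.pad(x, (pad, 0))                 # left-pad only => causal (no peeking ahead)
            x = torch.relu(conv(x))
        return self.head(x)                        # same length as the input sequence

model = CausalConvStack()
x = torch.randn(4, 1, 100)                         # a batch of 4 univariate sequences
print(model(x).shape)                              # torch.Size([4, 1, 100])
```

With kernel_size=2 and six layers the receptive field is 1 + (1+2+4+8+16+32) = 64 time steps, versus 7 for the same depth without dilation.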