From the course: Building Recommender Systems with Machine Learning and AI


Training recurrent neural networks

- [Instructor] Training RNNs, just like CNNs, is hard, and in some ways it's even harder. The main twist here is that we need to backpropagate not only through the neural network itself and all of its layers, but also through time. From a practical standpoint, every one of those time steps ends up looking like another layer in our neural network while we're trying to train it, and those time steps can add up fast. Over time we end up with a deeper and deeper neural network to train, and the cost of performing gradient descent on that increasingly deep network keeps growing. So to cap that training time, we often limit the backpropagation to a fixed number of time steps. We call this truncated backpropagation through time. It's something to keep in mind when you're training an RNN. You not only need to backpropagate through the neural network topology that you've created, you also need to backpropagate through all of the…
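To make the idea concrete, here is a minimal sketch of truncated backpropagation through time, assuming PyTorch (this is illustrative, not the course's own code, and the model, sequence length, and chunk size k are made-up placeholders). A long sequence is processed in chunks of k time steps, and the hidden state is detached at each chunk boundary so gradients never flow back further than k steps.

```python
# Illustrative sketch of truncated backpropagation through time (assumes PyTorch).
import torch
import torch.nn as nn

torch.manual_seed(0)

seq_len, batch, input_size, hidden_size = 200, 8, 4, 16
k = 25  # backpropagate through at most k time steps per update

rnn = nn.RNN(input_size, hidden_size)          # simple recurrent layer
readout = nn.Linear(hidden_size, 1)            # maps hidden state to a prediction
optimizer = torch.optim.SGD(
    list(rnn.parameters()) + list(readout.parameters()), lr=0.01
)

# Toy data: one long input sequence and a target value per time step.
inputs = torch.randn(seq_len, batch, input_size)
targets = torch.randn(seq_len, batch, 1)

hidden = torch.zeros(1, batch, hidden_size)
for start in range(0, seq_len, k):
    chunk_in = inputs[start:start + k]
    chunk_tgt = targets[start:start + k]

    # Detach the hidden state so gradients stop at the chunk boundary:
    # this is what truncates backpropagation through time.
    hidden = hidden.detach()

    optimizer.zero_grad()
    out, hidden = rnn(chunk_in, hidden)
    loss = nn.functional.mse_loss(readout(out), chunk_tgt)
    loss.backward()   # gradients flow back through at most k time steps
    optimizer.step()
```

Without the detach call, the computation graph would stretch back over the entire sequence, and each update would behave like backpropagation through a network hundreds of layers deep.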