
GTC Silicon Valley 2019, ID: S9251: Sparse Attentive Backtracking: Temporal Credit Assignment Through Reminding

Nan Rosemary Ke (MILA, University of Montreal)
Learning long-term dependencies in extended temporal sequences requires credit assignment to events far in the past. The most common method for training recurrent neural networks, backpropagation through time, requires credit information to be propagated backwards through every single step of the forward computation, potentially over thousands or millions of time steps. We'll describe how this becomes computationally expensive or even infeasible for long sequences. Although biological brains are unlikely to perform such detailed reverse replay over very long sequences of internal states, humans are often reminded of past memories or mental states that are associated with their current mental state. We'll discuss the hypothesis that such memory associations between past and present could be used for credit assignment through arbitrarily long sequences, propagating the credit assigned to the current state directly to the associated past state.
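
The abstract describes the core mechanism at a high level: a sparse attention over stored past hidden states creates skip connections along which credit can jump directly to "reminded" states, without backpropagating through every intermediate step. Below is a minimal PyTorch-style sketch of that idea; it is an illustration under stated assumptions, not the speakers' implementation. The class name SABCell, the choice of a GRU cell, the linear scoring network, the top-k value, and the memory-storage and truncation schedule are all hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SABCell(nn.Module):
    """Recurrent cell with a sparse-attentive skip path to stored past states.

    At each step, the current hidden state attends over a memory of
    previously stored hidden states; only the top-k scoring memories are
    kept, so gradients reach the distant past through a few sparse skip
    connections instead of through every intermediate time step.
    """

    def __init__(self, input_size, hidden_size, top_k=3):
        super().__init__()
        self.rnn = nn.GRUCell(input_size, hidden_size)
        self.score = nn.Linear(2 * hidden_size, 1)   # hypothetical scorer
        self.top_k = top_k

    def forward(self, x, h, memory):
        h = self.rnn(x, h)
        if memory:
            mem = torch.stack(memory, dim=1)             # (batch, M, hidden)
            q = h.unsqueeze(1).expand_as(mem)            # query vs. each memory
            scores = self.score(torch.cat([q, mem], dim=-1)).squeeze(-1)
            k = min(self.top_k, mem.size(1))
            top_val, top_idx = scores.topk(k, dim=1)     # sparse selection
            w = F.softmax(top_val, dim=1)
            idx = top_idx.unsqueeze(-1).expand(-1, -1, mem.size(-1))
            picked = mem.gather(1, idx)                  # (batch, k, hidden)
            # Credit flows back through only these k skip connections.
            h = h + (w.unsqueeze(-1) * picked).sum(dim=1)
        return h

# Usage sketch (toy data, hypothetical schedule): truncate ordinary BPTT,
# but keep stored memories attached to the graph so the sparse skip
# connections still deliver credit to the past.
cell = SABCell(input_size=8, hidden_size=32)
h = torch.zeros(4, 32)
memory = []
x_seq = torch.randn(4, 100, 8)                           # (batch, time, feat)
for t in range(x_seq.size(1)):
    h = cell(x_seq[:, t], h, memory)
    if t % 10 == 0:
        memory.append(h)      # store an attached "micro-state" for reminding
    if t % 5 == 4:
        h = h.detach()        # cut ordinary BPTT at short intervals
loss = h.pow(2).mean()        # toy objective
loss.backward()               # gradients still reach remembered past states
```

The detach calls truncate ordinary backpropagation through time to short local windows, while the memory entries remain attached to the computation graph; gradient from the final loss can therefore still reach distant past states, but only through the top-k attention edges, which is the "credit assignment through reminding" the talk describes.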

View the slides (pdf)