Deep Learning · November 2024 · 1 min read

RNN Sequence Modeling

Recurrent networks from scratch — forward pass, backpropagation through time, and gradient flow analysis. Vectorized NumPy implementation validated to 5e-5 tolerance.

Python · NumPy · PyTorch · Sequence Models

RNN forward and backward passes, hand-derived and vectorized. The implementation is gradient-checked to a 5e-5 maximum error. Sequence targets (x, y) are predicted across 10 timesteps and validated on four sample batches.

Implementation: Forward pass (all_h, last_h), BPTT, gradient check
Validation: Max error 5e-5 across all components
Architectures: Vanilla RNN, LSTM, GRU — compared head-to-head
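The project's exact code isn't reproduced here, but a minimal sketch of the same three pieces — a vanilla RNN forward pass returning all_h and last_h, BPTT, and a central-difference gradient check — might look like this (all shapes and names are illustrative assumptions):

```python
import numpy as np

def rnn_forward(x, h0, Wx, Wh, b):
    """Vanilla RNN over all timesteps.
    x: (N, T, D) inputs; h0: (N, H) initial state.
    Returns all_h: (N, T, H) and last_h: (N, H)."""
    N, T, D = x.shape
    all_h = np.zeros((N, T, h0.shape[1]))
    h = h0
    for t in range(T):
        h = np.tanh(x[:, t] @ Wx + h @ Wh + b)
        all_h[:, t] = h
    return all_h, h

def rnn_backward(x, h0, all_h, Wh, dout):
    """BPTT for the forward pass above; dout is dL/d(all_h)."""
    dWx = np.zeros((x.shape[2], Wh.shape[0]))
    dWh = np.zeros_like(Wh)
    db = np.zeros(Wh.shape[0])
    dh_next = np.zeros_like(h0)
    for t in reversed(range(x.shape[1])):
        h = all_h[:, t]
        h_prev = all_h[:, t - 1] if t > 0 else h0
        da = (dout[:, t] + dh_next) * (1.0 - h ** 2)  # backprop through tanh
        dWx += x[:, t].T @ da
        dWh += h_prev.T @ da
        db += da.sum(axis=0)
        dh_next = da @ Wh.T  # gradient flowing to the previous timestep
    return dWx, dWh, db

def numerical_grad(f, w, eps=1e-5):
    """Central-difference gradient of the scalar f() w.r.t. array w."""
    g = np.zeros_like(w)
    it = np.nditer(w, flags=["multi_index"])
    for _ in it:
        i = it.multi_index
        old = w[i]
        w[i] = old + eps; fp = f()
        w[i] = old - eps; fm = f()
        w[i] = old
        g[i] = (fp - fm) / (2 * eps)
    return g

np.random.seed(0)
N, T, D, H = 2, 10, 3, 4  # small illustrative sizes
x = np.random.randn(N, T, D)
h0 = np.random.randn(N, H)
Wx = np.random.randn(D, H)
Wh = np.random.randn(H, H) * 0.3
b = np.random.randn(H)

all_h, last_h = rnn_forward(x, h0, Wx, Wh, b)
dWx, dWh, db = rnn_backward(x, h0, all_h, Wh, np.ones_like(all_h))

# Check the analytic BPTT gradient for Wh against finite differences
loss = lambda: rnn_forward(x, h0, Wx, Wh, b)[0].sum()
err = np.max(np.abs(dWh - numerical_grad(loss, Wh)))
print(f"max |analytic - numerical| for dWh: {err:.1e}")
```

The same check applied per-parameter (Wx, Wh, b, and the input/state gradients) is what a "max error across all components" figure would summarize.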

Gradient Flow Analysis

The vanishing gradient problem in action: vanilla RNN gradients collapse to ~1e-7 at 50 timesteps, while the LSTM maintains ~1e-1. Hidden state evolution shows how the cell gates preserve long-range information that vanilla RNNs lose.
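The collapse itself is easy to reproduce: push a unit gradient backward through a tanh recurrence and watch its norm shrink with each BPTT step. A minimal sketch (the sizes and the 0.08 weight scale are illustrative assumptions, not the project's settings — the article's 1e-7 vs 1e-1 figures come from its own experiments):

```python
import numpy as np

np.random.seed(0)
H, T = 32, 50
Wh = np.random.randn(H, H) * 0.08  # contractive recurrent weights (assumption)
h = np.random.randn(H)

# Forward: record hidden states so tanh'(a) = 1 - h**2 is available later
hs = []
for _ in range(T):
    h = np.tanh(h @ Wh)
    hs.append(h)

# Backward: propagate a unit gradient from the last timestep toward t = 0,
# multiplying by the local Jacobian diag(1 - h**2) @ Wh.T at every step
dh = np.ones(H)
norms = []
for h_t in reversed(hs):
    dh = (dh * (1.0 - h_t ** 2)) @ Wh.T
    norms.append(np.linalg.norm(dh))

print(f"grad norm after  1 step : {norms[0]:.2e}")
print(f"grad norm after {T} steps: {norms[-1]:.2e}")
```

Each backward step multiplies the gradient by a Jacobian whose norm is below 1 here, so the product shrinks geometrically with depth; an LSTM's additive cell path avoids that repeated multiplication, which is why its gradients survive 50 timesteps.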

#rnn #lstm #sequence-modeling #backpropagation
