Deep Learning · October 2024 · 1 min read

Deep Learning from Scratch

Backpropagation, BatchNorm, Dropout, and CNNs implemented from first principles in NumPy — then PyTorch deployment achieving 74.8% on CIFAR-10.

Python · NumPy · PyTorch · Deep Learning

Every component of modern deep learning, hand-coded: vectorized backprop, BatchNorm, Dropout, and CNN layers, all gradient-checked to 1e-9 tolerance. The same stack, deployed in PyTorch, reaches 74.8% accuracy on CIFAR-10.
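For context, "gradient-checked" means every layer's analytic backward pass is compared against a central-difference estimate of the loss gradient. A minimal sketch of that check (function names here are illustrative, not the project's actual API):

```python
import numpy as np

def numerical_gradient(f, x, h=1e-5):
    # Central differences: (f(x+h) - f(x-h)) / 2h, one coordinate at a time.
    grad = np.zeros_like(x)
    it = np.nditer(x, flags=['multi_index'], op_flags=['readwrite'])
    while not it.finished:
        idx = it.multi_index
        old = x[idx]
        x[idx] = old + h
        fxph = f(x)
        x[idx] = old - h
        fxmh = f(x)
        x[idx] = old  # restore the original value
        grad[idx] = (fxph - fxmh) / (2 * h)
        it.iternext()
    return grad

def rel_error(a, b):
    # Max relative error; ~1e-9 in float64 means the analytic gradient matches.
    return np.max(np.abs(a - b) / np.maximum(1e-8, np.abs(a) + np.abs(b)))
```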

Layers: Affine, ReLU, Conv2D, MaxPool, BatchNorm, Dropout
Optimizers: SGD, Momentum, RMSProp, Adam — all from scratch
Performance: 74.8% CIFAR-10 with CNN + BN + Dropout
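To illustrate the optimizer side, here is a minimal sketch of one Adam step; the signature is hypothetical rather than the repo's interface. Adam layers a bias-corrected Momentum first moment on top of an RMSProp-style second moment:

```python
import numpy as np

def adam_update(w, dw, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # t is the 1-indexed step count, needed for bias correction.
    m = beta1 * m + (1 - beta1) * dw        # first moment (Momentum)
    v = beta2 * v + (1 - beta2) * dw ** 2   # second moment (RMSProp)
    m_hat = m / (1 - beta1 ** t)            # correct the zero-initialization bias
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```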

BatchNorm Deep Dive

Enables deep networks: Without BN, 15+ layer networks fail to train (15% accuracy). With BN, the same depth reaches 84%. Also provides robustness to learning rate choice across 5 orders of magnitude.
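As a rough sketch of how the forward pass works (assuming 2D inputs of shape (N, D); names are illustrative, not the project's API), train mode normalizes with batch statistics and tracks running averages that test mode reuses:

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, running_mean, running_var,
                      momentum=0.9, eps=1e-5, train=True):
    if train:
        mu = x.mean(axis=0)
        var = x.var(axis=0)
        x_hat = (x - mu) / np.sqrt(var + eps)
        # Exponential moving averages, used later at test time.
        running_mean = momentum * running_mean + (1 - momentum) * mu
        running_var = momentum * running_var + (1 - momentum) * var
    else:
        # Deterministic: normalize with the stored running statistics.
        x_hat = (x - running_mean) / np.sqrt(running_var + eps)
    out = gamma * x_hat + beta  # learnable scale and shift
    return out, running_mean, running_var
```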

Dropout Regularization

Prevents overfitting through stochastic neuron masking. Optimal dropout rate of 0.4 closes the train-val gap. Inverted dropout scales activations at train time for deterministic inference.
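A minimal sketch of inverted dropout (function names are illustrative): the kept activations are scaled by 1/(1-p) at train time, so inference is an unmodified identity pass:

```python
import numpy as np

def dropout_forward(x, p=0.4, train=True):
    if train:
        # Zero each unit with probability p; scale survivors by 1/(1-p)
        # so expected activations match between train and test.
        mask = (np.random.rand(*x.shape) >= p) / (1 - p)
        return x * mask, mask
    return x, None  # test time: identity, no scaling needed

def dropout_backward(dout, mask):
    # Gradient flows only through kept units, with the same scaling.
    return dout * mask
```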

Convolutional Networks

The CNN beats the fully connected baseline while using 10× fewer parameters. Features are learned hierarchically: edges → textures → parts → objects. The weakest classes (cat, bird, dog) share visual similarity; the strongest (car, truck, plane) have distinctive shapes.
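For reference, a naive loop-based Conv2D forward pass in NumPy's (N, C, H, W) convention; the function name and signature are assumptions, and a vectorized implementation would replace the Python loops:

```python
import numpy as np

def conv2d_forward(x, w, b, stride=1, pad=1):
    # x: (N, C, H, W) inputs; w: (F, C, HH, WW) filters; b: (F,) biases.
    N, C, H, W = x.shape
    F, _, HH, WW = w.shape
    xp = np.pad(x, ((0, 0), (0, 0), (pad, pad), (pad, pad)))
    H_out = (H + 2 * pad - HH) // stride + 1
    W_out = (W + 2 * pad - WW) // stride + 1
    out = np.zeros((N, F, H_out, W_out))
    for n in range(N):
        for f in range(F):
            for i in range(H_out):
                for j in range(W_out):
                    hs, ws = i * stride, j * stride
                    window = xp[n, :, hs:hs + HH, ws:ws + WW]
                    out[n, f, i, j] = np.sum(window * w[f]) + b[f]
    return out
```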

#neural-networks #backpropagation #convolutional-networks #batch-normalization
