Deep Learning Math
Backpropagation
Backpropagation is how neural networks learn — it applies the chain rule backwards through every layer to compute how much each weight contributed to the error, then nudges each weight opposite its gradient to reduce that error.
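A minimal sketch of the idea, assuming a one-hidden-layer network with sigmoid activations, squared-error loss, and a made-up learning rate — all names (W1, W2, lr) are illustrative, not from any framework:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 3))       # single input example
y = np.array([[1.0]])             # target output

W1 = rng.normal(size=(3, 4))      # input -> hidden weights
W2 = rng.normal(size=(4, 1))      # hidden -> output weights
lr = 0.1                          # learning rate (assumed)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for step in range(100):
    # forward pass
    h = sigmoid(x @ W1)                  # hidden activations
    y_hat = sigmoid(h @ W2)              # prediction
    losses.append(0.5 * np.sum((y_hat - y) ** 2))

    # backward pass: chain rule applied layer by layer
    d_yhat = y_hat - y                   # dL/d(y_hat)
    d_z2 = d_yhat * y_hat * (1 - y_hat)  # through the output sigmoid
    d_W2 = h.T @ d_z2                    # dL/dW2
    d_h = d_z2 @ W2.T                    # error flowing into the hidden layer
    d_z1 = d_h * h * (1 - h)             # through the hidden sigmoid
    d_W1 = x.T @ d_z1                    # dL/dW1

    # nudge each weight opposite its gradient
    W1 -= lr * d_W1
    W2 -= lr * d_W2
```

Each `d_*` term is one link in the chain rule; the loss falls over the 100 steps as the weights move downhill.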
Activation Functions
Activation functions introduce non-linearity into neural networks. Without them, stacking layers is mathematically equivalent to a single layer — the network can't learn curves, boundaries, or complex patterns.
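The "stacking linear layers collapses to one layer" claim can be checked directly — a small sketch with random matrices (ReLU chosen here just as an example of a non-linearity):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(5, 3))       # 5 inputs, 3 features each
W1 = rng.normal(size=(3, 4))
W2 = rng.normal(size=(4, 2))

# Two stacked linear layers...
two_layers = (x @ W1) @ W2
# ...equal a single layer whose weights are the product W1 @ W2.
combined = x @ (W1 @ W2)

# Insert a ReLU between the layers and the collapse no longer holds:
relu = lambda z: np.maximum(z, 0)
nonlinear = relu(x @ W1) @ W2
```

`two_layers` and `combined` match to floating-point precision; `nonlinear` does not, which is exactly why depth only buys expressive power when activations sit between the layers.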
Softmax and Cross-Entropy
Softmax converts raw scores into probabilities. Cross-entropy measures how wrong those probabilities are. Together, they form the standard output layer + loss function for classification — and they're mathematically designed to complement each other.
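A sketch of the pair, with made-up logits and a one-hot target. The "designed to complement each other" part is the gradient: cross-entropy through softmax differentiates to the simple expression probs − target.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)   # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

logits = np.array([[2.0, 1.0, 0.1]])        # raw scores (assumed values)
target = np.array([[1.0, 0.0, 0.0]])        # one-hot true class

probs = softmax(logits)                     # probabilities, summing to 1
loss = -np.sum(target * np.log(probs))      # cross-entropy

# Gradient of the loss w.r.t. the logits simplifies to:
grad = probs - target
```

That clean gradient is why the two are almost always fused into a single layer in practice.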
Vectorized Operations
Vectorized operations replace slow Python loops with fast matrix math. In neural networks, a single matrix multiplication processes an entire batch of inputs simultaneously — this is what makes training on GPUs possible and practical.
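A small sketch of the equivalence, with an assumed batch of 64 inputs: the Python loop and the single matrix multiplication compute identical outputs, but the matmul dispatches the whole batch to optimized BLAS (or GPU) kernels at once.

```python
import numpy as np

rng = np.random.default_rng(2)
batch = rng.normal(size=(64, 128))   # 64 inputs, 128 features each
W = rng.normal(size=(128, 32))       # layer weights: 128 -> 32

# Loop version: process one input at a time.
loop_out = np.stack([batch[i] @ W for i in range(batch.shape[0])])

# Vectorized version: the entire batch in one matrix multiplication.
vec_out = batch @ W
```

The results agree element for element; only the cost differs, and the gap widens dramatically at realistic batch and layer sizes.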