Topics
math for ai engineering
Mathematical foundations for AI engineering — linear algebra, calculus, probability, and statistics.
39 notes
machine learning
Core ML concepts — supervised and unsupervised learning, algorithms, and model evaluation.
0 notes
deep learning
Neural networks, architectures, training techniques, and deep learning fundamentals.
0 notes
generative ai
Large language models, diffusion models, prompting, and the landscape of generative AI.
0 notes
ai engineering
Building production AI systems — deployment, evaluation, tooling, and infrastructure.
0 notes
software engineering
Principles, patterns, and practices for writing robust, maintainable software.
0 notes
system design
Designing scalable, reliable distributed systems — architecture, trade-offs, and patterns.
0 notes
clean code
Writing readable, maintainable code — naming, structure, refactoring, and code review.
0 notes
All Notes
A random variable assigns a number to each outcome of a random process. It's the bridge between abstract probability and concrete calculations — and the foundation for every distribution, expectation, and loss function in ML.
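A minimal sketch of that bridge: the random process below is a fair coin flip, and the random variable (here a Bernoulli variable, chosen purely for illustration) maps each outcome to a number you can compute with.

```python
import random

# A random process: flipping a fair coin.
outcomes = ["heads", "tails"]

# A random variable X assigns a number to each outcome.
# Here X(heads) = 1, X(tails) = 0 — a Bernoulli random variable.
X = {"heads": 1, "tails": 0}

# Sample the process; the random variable turns outcomes into numbers,
# which is what lets us compute averages, variances, and losses.
random.seed(0)
samples = [X[random.choice(outcomes)] for _ in range(10)]
print(samples)  # ten values, each 0 or 1
```

Once outcomes are numbers, everything downstream (expectation, variance, loss) is just arithmetic over those numbers.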
Variance measures how spread out a distribution is — the average squared distance from the mean. Standard deviation is its square root, in the same units as the data. These underlie batch normalization, weight initialization, and uncertainty quantification.
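As a quick worked example (the data here is made up for illustration), variance and standard deviation fall straight out of the definition — average squared distance from the mean, then its square root:

```python
# Illustrative dataset (population variance, dividing by n).
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

mean = sum(data) / len(data)                               # 5.0
variance = sum((x - mean) ** 2 for x in data) / len(data)  # 4.0
std = variance ** 0.5                                      # 2.0, same units as the data
print(mean, variance, std)
```

Note this divides by n (population variance); sample variance divides by n − 1, which is what most libraries default to.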
The expectation of a random variable is its long-run average — the value you'd expect if you repeated the experiment many times. It's the foundation of loss functions, gradient estimates, and reasoning about model performance.
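A sketch of the long-run-average intuition, using a fair die (the roll count and seed are arbitrary): the empirical mean of many repeated rolls converges to the exact expectation.

```python
import random

random.seed(42)

# Exact expectation of a fair die: E[X] = (1 + 2 + ... + 6) / 6 = 3.5.
exact = sum(range(1, 7)) / 6

# Long-run average of repeated rolls approaches E[X] (law of large numbers).
rolls = [random.randint(1, 6) for _ in range(100_000)]
empirical = sum(rolls) / len(rolls)
print(exact, round(empirical, 3))  # empirical mean lands near 3.5
```

This same pattern — estimate an expectation by averaging samples — is exactly what a minibatch gradient estimate does during training.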
Two events are independent if knowing one happened tells you nothing about whether the other happened. Independence assumptions — i.i.d. training data, conditionally independent features — are what keep many ML algorithms tractable.
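Independence can be checked directly against its definition, P(A and B) = P(A)·P(B). A small sketch over two fair die rolls (the events A and B below are arbitrary examples):

```python
import itertools

# Two fair die rolls: all 36 ordered outcomes are equally likely.
outcomes = list(itertools.product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event = fraction of outcomes where it holds."""
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

A = lambda o: o[0] % 2 == 0   # first roll is even
B = lambda o: o[1] > 4        # second roll is 5 or 6

p_a, p_b = prob(A), prob(B)
p_ab = prob(lambda o: A(o) and B(o))

# Independence: P(A and B) equals P(A) * P(B).
print(p_a, p_b, p_ab)  # 1/2, 1/3, 1/6
```

The two rolls don't influence each other, so the product rule holds exactly; for dependent events (say, B = "the sum is 12") it would fail.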
Probability is a number between 0 and 1 that measures how likely something is. A handful of rules let you combine and reason about probabilities — they're the foundation of everything in statistics and ML.
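Two of those core rules — the complement rule and the union (inclusion-exclusion) rule — worked on a fair six-sided die, with the events chosen just for illustration:

```python
# Fair six-sided die.
p_even = 3 / 6   # P(even) = P({2, 4, 6})
p_gt4 = 2 / 6    # P(roll > 4) = P({5, 6})

# Complement rule: P(not A) = 1 - P(A).
p_not_even = 1 - p_even  # 1/2

# Union rule: P(A or B) = P(A) + P(B) - P(A and B).
p_even_and_gt4 = 1 / 6                            # only the roll 6
p_even_or_gt4 = p_even + p_gt4 - p_even_and_gt4   # P({2, 4, 5, 6}) = 4/6

print(p_not_even, p_even_or_gt4)
```

Subtracting P(A and B) avoids double-counting the overlap — here the roll 6, which is both even and greater than 4.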