Mnemosyne

Probability

The mathematics of uncertainty — distributions, inference, and the probabilistic foundations of every loss function and generative model in AI.

Probability Rules

Probability is a number between 0 and 1 that measures how likely something is. A handful of rules (the sum rule, the product rule, the complement rule) let you combine and reason about probabilities; they're the foundation of everything in statistics and ML.
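
As a concrete check, the rules can be verified by brute-force enumeration. Here's a minimal sketch; the two-dice events are just illustrative:

    from fractions import Fraction

    # All 36 equally likely outcomes of rolling two fair dice.
    outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]

    def prob(event):
        # P(event) = favorable outcomes / total outcomes
        return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

    def A(o): return o[0] == 6            # first die shows a 6
    def B(o): return o[0] + o[1] >= 10    # the total is at least 10

    p_or = prob(lambda o: A(o) or B(o))
    p_and = prob(lambda o: A(o) and B(o))

    # Sum rule: P(A or B) = P(A) + P(B) - P(A and B)
    assert p_or == prob(A) + prob(B) - p_and    # 1/4 == 1/6 + 1/6 - 1/12

    # Complement rule: P(not A) = 1 - P(A)
    assert prob(lambda o: not A(o)) == 1 - prob(A)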

Independence

Two events are independent if knowing one happened tells you nothing about whether the other happened. Independence assumptions, from i.i.d. training examples to Naive Bayes, are what keep many ML computations tractable.
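
A sketch of the defining test, P(A and B) = P(A) * P(B), again with illustrative dice events:

    from fractions import Fraction

    outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]

    def prob(event):
        return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

    def first_even(o):  return o[0] % 2 == 0      # depends only on die 1
    def second_six(o):  return o[1] == 6          # depends only on die 2
    def total_high(o):  return o[0] + o[1] >= 10  # depends on both dice

    # Independent: the dice don't influence each other,
    # so the joint probability factors into the product.
    assert prob(lambda o: first_even(o) and second_six(o)) == \
           prob(first_even) * prob(second_six)    # 1/12 == 1/2 * 1/6

    # Dependent: a high total forces die 1 to be at least 4,
    # which changes the odds that it's even, so factoring fails.
    assert prob(lambda o: first_even(o) and total_high(o)) != \
           prob(first_even) * prob(total_high)    # 1/9 != 1/12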

Bayes' Theorem

Bayes' theorem is the rule for updating your beliefs when you get new evidence. Prior belief + new data → updated belief. It's the mathematical foundation for how any rational agent should reason under uncertainty.
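
Here's a minimal sketch of the update with made-up numbers: a 1% base rate, and a test with 90% sensitivity and a 5% false-positive rate.

    # Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)
    # Hypothetical numbers for a disease-screening example.
    p_disease = 0.01            # prior: 1% base rate
    p_pos_given_disease = 0.90  # sensitivity
    p_pos_given_healthy = 0.05  # false-positive rate

    # Total probability of a positive test (the evidence).
    p_pos = (p_pos_given_disease * p_disease
             + p_pos_given_healthy * (1 - p_disease))

    # Posterior: belief in disease after seeing a positive test.
    p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
    print(f"P(disease | positive) = {p_disease_given_pos:.3f}")  # ~0.154

Even with a positive result, the posterior is only about 15%, because the prior is so low. That shift from 1% to 15% is exactly the "prior + data" update.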

Random Variables

A random variable assigns a number to each outcome of a random process. It's the bridge between abstract probability and concrete calculations — and the foundation for every distribution, expectation, and loss function in ML.
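
For example (an illustrative sketch), "the sum of two dice" is a random variable: it maps each of the 36 raw outcomes to a number, and counting outcomes gives its distribution:

    from collections import Counter
    from fractions import Fraction

    # The random process: all 36 equally likely rolls of two dice.
    outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]

    # The random variable X assigns a number (the total) to each outcome.
    def X(outcome):
        return outcome[0] + outcome[1]

    # Its distribution: P(X = x) for every value x it can take.
    counts = Counter(X(o) for o in outcomes)
    for x, n in sorted(counts.items()):
        print(f"P(X = {x:2d}) = {Fraction(n, len(outcomes))}")  # e.g. P(X = 7) = 1/6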

Distributions

The Normal, Bernoulli, and Binomial distributions are three fundamental shapes of randomness. They appear constantly in ML as model outputs, noise models, and the implicit assumption behind loss functions: squared error assumes Gaussian noise, and cross-entropy assumes Bernoulli outcomes.
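
A quick sketch of sampling from all three, assuming NumPy is available; the parameter values here are arbitrary:

    import numpy as np

    rng = np.random.default_rng(0)

    # Bernoulli(p): a single yes/no trial (a Binomial with n = 1).
    coin_flips = rng.binomial(n=1, p=0.3, size=10_000)

    # Binomial(n, p): number of successes in n independent Bernoulli trials.
    successes = rng.binomial(n=20, p=0.3, size=10_000)

    # Normal(mu, sigma): the bell curve; sums of many small effects look like it.
    heights = rng.normal(loc=170.0, scale=8.0, size=10_000)

    print(coin_flips.mean())              # ~0.3 (matches p)
    print(successes.mean())               # ~6.0 (matches n * p)
    print(heights.mean(), heights.std())  # ~170, ~8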

Expectation

The expectation of a random variable is its long-run average: the value the sample mean settles on if you repeat the experiment many times. It's the foundation of loss functions, gradient estimates, and reasoning about model performance.
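
A small sketch of both views, using a fair die as the running example: the exact weighted sum, and a simulation whose sample mean converges to it:

    import random

    # Exact expectation of a fair die: E[X] = sum of x * P(X = x).
    exact = sum(x * (1 / 6) for x in range(1, 7))          # 3.5

    # Monte Carlo estimate: by the law of large numbers, the
    # sample mean converges to E[X] as the number of rolls grows.
    random.seed(0)
    rolls = [random.randint(1, 6) for _ in range(100_000)]
    estimate = sum(rolls) / len(rolls)

    print(exact, estimate)   # 3.5 vs. roughly 3.5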

Variance and Standard Deviation

Variance measures how spread out a distribution is — the average squared distance from the mean. Standard deviation is its square root, in the same units as the data. These underlie batch normalization, weight initialization, and uncertainty quantification.
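
Continuing the fair-die example (a minimal sketch), both quantities fall out of the definition directly:

    import math

    # Fair die: P(X = x) = 1/6 for x in 1..6, with mean E[X] = 3.5.
    mean = sum(x / 6 for x in range(1, 7))

    # Variance: the average squared distance from the mean.
    variance = sum((x - mean) ** 2 / 6 for x in range(1, 7))   # 35/12, about 2.917

    # Standard deviation: the square root, back in the data's units.
    std = math.sqrt(variance)                                  # about 1.708

    print(mean, variance, std)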