2 - Probability: Univariate Models

2.1 - Introduction

There are two different interpretations of probability:

  1. In the frequentist interpretation, probabilities represent the long-run frequencies of events that can happen multiple times.

  2. In the Bayesian interpretation, probability is used to quantify our uncertainty about something.

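The frequentist interpretation above can be illustrated with a short simulation (an illustrative sketch; `long_run_frequency` and its parameters are my own, not from the text): as the number of trials grows, the observed frequency of an event approaches its probability.

```python
import random

def long_run_frequency(p: float, n: int, seed: int = 0) -> float:
    """Estimate the probability of an event as its long-run frequency
    over n independent trials (here, coin flips with P(heads) = p)."""
    rng = random.Random(seed)
    heads = sum(rng.random() < p for _ in range(n))
    return heads / n

# With a fair coin, the frequency settles near 0.5 for large n.
print(long_run_frequency(p=0.5, n=100_000))
```

The Bayesian interpretation, by contrast, assigns a probability even to one-off events (e.g. "the polar ice caps will melt by 2030"), where no long-run frequency exists.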
2.2 - Random Variables

2.2.1 - Discrete random variables

2.2.2 - Continuous random variables

2.2.4 - Independence and conditional independence

2.2.5 - Moments of a distribution

2.2.6 - Limitations of summary statistics *

2.3 - Bayes’ rule

2.3.1 - Example: Testing for COVID-19

2.3.2 - Example: The Monty Hall problem

2.3.3 - Inverse problems *

2.4 - Bernoulli and binomial distributions

2.4.1 - Definition

2.4.2 - Sigmoid (logistic) function

2.4.3 - Binary logistic regression

2.5 - Categorical and multinomial distributions

2.5.1 - Definition

2.5.2 - Softmax function

2.5.3 - Multiclass logistic regression

2.5.4 - Log-sum-exp trick

2.6 - Univariate Gaussian (normal) distribution

2.6.1 - Cumulative distribution function

2.6.2 - Probability density function

2.6.3 - Regression

2.6.4 - Why is the Gaussian distribution so widely used?

2.6.5 - Dirac delta function as a limiting case

2.7 - Some other common univariate distributions *

2.7.1 - Student t distribution

2.7.2 - Cauchy distribution

2.7.3 - Laplace distribution

2.7.4 - Beta distribution

2.7.5 - Gamma distribution

2.7.6 - Empirical distribution

2.8 - Transformations of random variables *

2.8.1 - Discrete case

2.8.2 - Continuous case

2.8.3 - Invertible transformations (bijections)

2.8.4 - Moments of a linear transformation

2.8.5 - The convolution theorem

2.8.6 - Central limit theorem

2.8.7 - Monte Carlo approximation

2.9 - Exercises