What Are Random Variables?

If we flip a coin 10 times, how many times will heads appear? What is the longest run of tails? Quantities like these, which are functions of the outcomes of an experiment, are known as random variables.

A random variable is a function \(X : \Omega \rightarrow \mathbb{R}\), mapping each outcome in the sample space to a real number. There are two types of random variables:

  1. Discrete – The random variable takes values in a countable set. For example, the number of heads in a sequence of n coin tosses.

  2. Continuous – The random variable takes values in an uncountably infinite set, such as an interval of the real line. For example, the amount of time it takes for a radioactive particle to decay. Both examples are simulated in the sketch below.
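
To make the distinction concrete, here is a minimal Python sketch of both examples. It assumes numpy is available and models the decay time as an exponential random variable, which is an illustrative choice rather than anything required by the definitions above.

import numpy as np

rng = np.random.default_rng(seed=0)

# Discrete: number of heads in 10 fair coin tosses, takes values in {0, 1, ..., 10}
num_heads = rng.binomial(n=10, p=0.5)

# Continuous: waiting time until a particle decays, modelled here as exponential,
# takes values anywhere in [0, infinity)
decay_time = rng.exponential(scale=1.0)

print(num_heads, decay_time)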

When describing the event that a random variable takes on a certain value, we use the indicator function, which takes the value 1 when the event occurs and 0 otherwise, as shown below:

\( 1\{X > 3\} = \begin{cases} 1, & \text{if } X > 3 \\ 0, & \text{otherwise} \end{cases} \)
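
As a quick illustration, the indicator above can be written as a one-line Python function; the threshold 3 simply mirrors the example.

def indicator(x):
    # 1 if the event {X > 3} occurred for this outcome, 0 otherwise
    return 1 if x > 3 else 0

print(indicator(5))  # 1
print(indicator(2))  # 0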

Cumulative Distribution Functions (CDF)

It is often convenient to use alternative functions to measure probability when dealing with random variables. One such function is the cumulative distribution function (CDF). The CDF is a function \(F_X : \mathbb{R} \rightarrow [0, 1]\) defined by \(F_X(x) = P(X \leq x)\), the probability that X takes a value no greater than x.
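
For instance, if X is the number of heads in 10 fair coin tosses, the CDF can be evaluated with scipy.stats (an assumed dependency; any equivalent library would do):

from scipy.stats import binom

# X = number of heads in 10 fair coin tosses
n, p = 10, 0.5

# F_X(4) = P(X <= 4)
print(binom.cdf(4, n, p))   # ~ 0.377

# The CDF is non-decreasing and reaches 1 at the largest possible value
print(binom.cdf(10, n, p))  # 1.0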

Probability Mass Functions (PMF)

When X is a discrete random variable, we can directly specify the probability of each value the random variable can assume. This is the probability mass function (PMF), a function \(p_X : \Omega \rightarrow \mathbb{R}\) such that \(p_X(x) = P(X = x)\).

Val(X) denotes the set of possible values X can take. If \(X(\omega)\) is the random variable counting the number of heads out of 10 tosses, then Val(X) = {0, 1, 2, ..., 10}.
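
A minimal sketch of this PMF, again assuming scipy is available: it evaluates \(p_X(x)\) over Val(X) = {0, 1, ..., 10} and checks that the probabilities sum to 1.

from scipy.stats import binom

n, p = 10, 0.5
val_x = range(n + 1)  # Val(X) = {0, 1, ..., 10}

# p_X(x) = P(X = x) for every value X can take
pmf = {x: binom.pmf(x, n, p) for x in val_x}

print(pmf[5])             # ~ 0.246, the most likely number of heads
print(sum(pmf.values()))  # ~ 1.0, since a PMF must sum to 1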

Probability Density Functions (PDF)

For some continuous random variables, the CDF is differentiable everywhere. In this case, we can define the probability density function (PDF) as the derivative of the CDF: \(f_X(x) = \frac{dF_X(x)}{dx}\).

Note that the PDF does not always exist, since the CDF may not be differentiable everywhere.
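
As a sanity check on the definition, the sketch below (assuming scipy) differentiates the standard normal CDF numerically and compares the result with the closed-form PDF:

from scipy.stats import norm

x, h = 0.5, 1e-6

# Numerical derivative of the CDF via a central difference
pdf_from_cdf = (norm.cdf(x + h) - norm.cdf(x - h)) / (2 * h)

print(pdf_from_cdf)  # ~ 0.3521
print(norm.pdf(x))   # ~ 0.3521, matching the derivative of the CDF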

Expectation

For discrete random variables: \( \mathbb{E}[g(X)] = \sum_{x \in Val(X)} g(x)\,p_X(x)\).

For continuous random variables: \( \mathbb{E}[g(X)] = \int_{-\infty}^{\infty} g(x)f_X(x)\,dx\).

Essentially, the expectation can be seen as a weighted average of the values g(x) can take, where the weight on each value is given by the PMF in the discrete case or the PDF in the continuous case.
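
For example, with g(x) = x and X the number of heads in 10 fair tosses, the discrete formula recovers the familiar mean of a binomial random variable, n·p = 5 (sketch assumes scipy):

from scipy.stats import binom

n, p = 10, 0.5

# E[g(X)] = sum over Val(X) of g(x) * p_X(x), here with g(x) = x
expectation = sum(x * binom.pmf(x, n, p) for x in range(n + 1))

print(expectation)  # ~ 5.0, which equals n * p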

Variance

Variance measures how concentrated the distribution of X is around its mean \(\mathbb{E}[X]\). It is computed as follows:

\( Var[X] = \mathbb{E}[(X - \mathbb{E}[X])^2] \).

which is equivalent to:

\( Var[X] = \mathbb{E}[X^2] - \mathbb{E}[X]^2\).
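
The second form follows from expanding the square and using linearity of expectation (recall that \(\mathbb{E}[X]\) is a constant):

\( \mathbb{E}[(X - \mathbb{E}[X])^2] = \mathbb{E}[X^2 - 2X\mathbb{E}[X] + \mathbb{E}[X]^2] = \mathbb{E}[X^2] - 2\mathbb{E}[X]^2 + \mathbb{E}[X]^2 = \mathbb{E}[X^2] - \mathbb{E}[X]^2 \).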

List of common random variables

Discrete random variables
  • Bernoulli distribution
  • Binomial distribution
  • Geometric distribution
  • Poisson distribution
Continuous random variables
  • Uniform distribution
  • Exponential distribution
  • Normal / Gaussian distribution
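
A brief sketch of how each of these distributions can be sampled with numpy; the parameter values are arbitrary, illustrative choices:

import numpy as np

rng = np.random.default_rng(seed=0)

# Discrete random variables
bernoulli = rng.binomial(n=1, p=0.3)      # Bernoulli(p) is Binomial(1, p)
binomial = rng.binomial(n=10, p=0.5)      # number of successes in 10 trials
geometric = rng.geometric(p=0.2)          # trials until the first success
poisson = rng.poisson(lam=4.0)            # number of events in an interval

# Continuous random variables
uniform = rng.uniform(low=0.0, high=1.0)    # equally likely over [0, 1)
exponential = rng.exponential(scale=2.0)    # waiting time with mean 2
normal = rng.normal(loc=0.0, scale=1.0)     # Gaussian with mean 0, std 1

print(bernoulli, binomial, geometric, poisson, uniform, exponential, normal)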