Expected Value, Variance and Covariance of a Random Variable

Notation

  • The expected value of a random variable $X$ is often denoted by $E(X)$, $E[X]$, or $EX$, with $E$ also often stylized as $\mathbb{E}$ or $\mathrm{E}$, or symbolically as $\mu_X$ or simply $\mu$.
  • The variance of a random variable $X$ is typically designated as $\operatorname{Var}(X)$, or sometimes as $V(X)$ or $\mathbb{V}(X)$, or symbolically as $\sigma_X^2$ or simply $\sigma^2$ (pronounced "sigma squared").
  • The covariance of two random variables $X$ and $Y$ is typically designated as $\operatorname{Cov}(X,Y)$.

Expected value

In probability theory, the expected value (also called expectation, expectancy, expectation operator, mathematical expectation, mean, expectation value, or first moment) is a generalization of the weighted average. Informally, the expected value is the arithmetic mean of the possible values a random variable can take, weighted by the probability of those outcomes.


For a discrete random variable $X$ with sample space (or alphabet) $\mathcal{X}$, the expectation of $X$ is then given by the weighted average $E[X]=\sum_{i} x_i P(X=x_i)$, where $x_i \in \mathcal{X}$.
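As a concrete sketch of the weighted average above, consider a hypothetical fair six-sided die (the die and its probabilities are illustrative, not from the text):

```python
# Expected value of a discrete random variable as a probability-weighted sum:
# E[X] = sum_i x_i * P(X = x_i).
# Hypothetical example: a fair six-sided die, P(X = x) = 1/6 for x in 1..6.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

expected = sum(x * p for x, p in zip(values, probs))
print(expected)  # ≈ 3.5
```

The true mean of a fair die is $21/6 = 3.5$, which the weighted sum recovers (up to floating-point rounding).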


For a continuous random variable $X$ with probability density function given by a function $f$, the expectation of $X$ is then given by the integral $E[X]=\int_{-\infty}^{\infty} x f(x)\,dx$.
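The integral can be approximated numerically. A rough sketch, using a hypothetical exponential density $f(x)=\lambda e^{-\lambda x}$ (whose true mean is $1/\lambda$) and a simple Riemann sum:

```python
# Numerically approximate E[X] = ∫ x f(x) dx for a continuous density.
# Hypothetical example: exponential density f(x) = lam * exp(-lam * x), x >= 0.
import math

lam = 2.0

def f(x):
    return lam * math.exp(-lam * x)

# Left Riemann sum over [0, 50]; the tail beyond 50 is negligible here.
dx = 1e-4
expected = sum(i * dx * f(i * dx) * dx for i in range(int(50 / dx)))
print(round(expected, 3))  # ≈ 1 / lam = 0.5
```

A coarse grid like this is fine for illustration; for real work one would use `scipy.integrate.quad` or a closed form when available.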

Linearity of expectation

The expectation is a linear operator, i.e., $E[aX+bY]=aE[X]+bE[Y]$ for any constants $a,b$, whether or not $X$ and $Y$ are independent. This can be proved via LOTUS (the law of the unconscious statistician).
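Linearity can be checked exactly on a small joint distribution. The joint pmf below is hypothetical and deliberately dependent, to emphasize that linearity does not require independence:

```python
# Check linearity of expectation: E[aX + bY] = a*E[X] + b*E[Y].
# Hypothetical joint pmf p(x, y) for a dependent pair (X, Y).
joint = {
    (0, 0): 0.4,
    (1, 1): 0.4,
    (1, 0): 0.2,
}
a, b = 3.0, -2.0

lhs = sum(p * (a * x + b * y) for (x, y), p in joint.items())  # E[aX + bY]
ex = sum(p * x for (x, y), p in joint.items())                 # E[X]
ey = sum(p * y for (x, y), p in joint.items())                 # E[Y]
rhs = a * ex + b * ey
print(abs(lhs - rhs) < 1e-12)  # True
```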

Variance

The variance of a random variable $X$ is the expected value of the squared deviation from the mean of $X$: $\operatorname{Var}(X)=E[(X-E[X])^2]$.

This definition encompasses random variables that are generated by processes that are discrete, continuous, neither, or mixed.

The variance can also be thought of as the covariance of a random variable with itself: $\operatorname{Var}(X)=\operatorname{Cov}(X,X)$.

One important property is that
$$\begin{aligned}
\operatorname{Var}(X) &= E[(X-E[X])^2] \\
&= E[X^2 - 2XE[X] + E[X]^2] \\
&= E[X^2] - 2E[X]E[X] + E[X]^2 \\
&= E[X^2] - 2E[X]^2 + E[X]^2 \\
&= E[X^2] - E[X]^2.
\end{aligned}$$
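The identity $\operatorname{Var}(X)=E[X^2]-E[X]^2$ can be verified numerically. The pmf below is a hypothetical example chosen for illustration:

```python
# Verify Var(X) = E[(X - E[X])^2] = E[X^2] - E[X]^2 on a discrete pmf.
# Hypothetical distribution: P(X=0)=0.2, P(X=1)=0.5, P(X=4)=0.3.
pmf = {0: 0.2, 1: 0.5, 4: 0.3}

ex = sum(p * x for x, p in pmf.items())                    # E[X]
ex2 = sum(p * x * x for x, p in pmf.items())               # E[X^2]
var_def = sum(p * (x - ex) ** 2 for x, p in pmf.items())   # definition
var_shortcut = ex2 - ex ** 2                               # shortcut formula
print(abs(var_def - var_shortcut) < 1e-12)  # True
```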

Covariance

Covariance in probability theory and statistics is a measure of the joint variability of two random variables.

For two jointly distributed real-valued random variables $X$ and $Y$, the covariance is $\operatorname{Cov}(X,Y)=E[(X-E[X])(Y-E[Y])]$.

One important property is that
$$\begin{aligned}
\operatorname{Cov}(X,Y) &= E[(X-E[X])(Y-E[Y])] \\
&= E[XY - XE[Y] - E[X]Y + E[X]E[Y]] \\
&= E[XY] - E[X]E[Y] - E[X]E[Y] + E[X]E[Y] \\
&= E[XY] - E[X]E[Y].
\end{aligned}$$
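As with variance, the shortcut $\operatorname{Cov}(X,Y)=E[XY]-E[X]E[Y]$ can be checked against the definition on a small joint distribution. The joint pmf is again a hypothetical example:

```python
# Verify Cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y].
# Hypothetical joint pmf p(x, y).
joint = {(0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25, (2, 2): 0.25}

ex = sum(p * x for (x, y), p in joint.items())       # E[X]
ey = sum(p * y for (x, y), p in joint.items())       # E[Y]
exy = sum(p * x * y for (x, y), p in joint.items())  # E[XY]
cov_def = sum(p * (x - ex) * (y - ey) for (x, y), p in joint.items())
cov_shortcut = exy - ex * ey
print(abs(cov_def - cov_shortcut) < 1e-12)  # True
```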

What if two random variables are independent?

Suppose $X$ and $Y$ are independent. Independence implies $E[XY]=E[X]E[Y]$, so
$$\operatorname{Cov}(X,Y)=E[(X-E[X])(Y-E[Y])]=E[XY]-E[X]E[Y]=E[X]E[Y]-E[X]E[Y]=0.$$
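This can be seen concretely: for independent variables the joint pmf factorizes as $p(x,y)=p(x)\,q(y)$, which forces the covariance to vanish. A small sketch with hypothetical marginals:

```python
# For independent X and Y the joint pmf factorizes, p(x, y) = p(x) * q(y),
# so E[XY] = E[X]E[Y] and hence Cov(X, Y) = 0.
# Hypothetical marginal distributions:
px = {0: 0.3, 1: 0.7}
qy = {-1: 0.5, 2: 0.5}
joint = {(x, y): p * q for x, p in px.items() for y, q in qy.items()}

ex = sum(p * x for (x, y), p in joint.items())       # E[X]
ey = sum(p * y for (x, y), p in joint.items())       # E[Y]
exy = sum(p * x * y for (x, y), p in joint.items())  # E[XY]
print(abs(exy - ex * ey) < 1e-12)  # True, i.e. Cov(X, Y) = 0
```

Note the converse is false in general: zero covariance does not imply independence.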