Independence

Ref:

  1. NOTES ON PROBABILITY by Greg Lawler
  2. Probability Theory by Kyle Siegrist
  3. The Book of Statistical Proofs

Independence of Events

Let $A_1, \ldots, A_n$ be $n$ arbitrary statements about random variables $X_1, X_2, \ldots, X_n$ with possible values $\mathcal{X}_1, \mathcal{X}_2, \ldots, \mathcal{X}_n$.

$A_1, \ldots, A_n$ are called statistically independent if, for all $x_i \in \mathcal{X}_i$, $i = 1, \ldots, n$:

$$p\left(\bigcap_{i=1}^{n} A_i\right) = \prod_{i=1}^{n} p(X_i = x_i)$$

Note: $p\left(\bigcap_{i=1}^{n} A_i\right) = p(A_1, \ldots, A_n) = p(X_1 = x_1, \ldots, X_n = x_n)$,

where $p(X_1 = x_1, \ldots, X_n = x_n)$ is the joint probability of $X_1 = x_1, \ldots, X_n = x_n$ and $p(X_i = x_i)$ are the marginal probabilities of the statements $A_i : X_i = x_i$.
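As a quick numeric sketch of this definition (my own illustration, not from the references), the check below builds the joint pmf of two fair dice as a hypothetical example and verifies that it equals the product of the marginals at every value combination:

```python
from fractions import Fraction
from itertools import product

# Hypothetical example: two fair dice X1, X2 rolled independently.
# Joint pmf p(X1 = x1, X2 = x2), each of the 36 outcomes with mass 1/36.
joint = {(x1, x2): Fraction(1, 36) for x1, x2 in product(range(1, 7), repeat=2)}

# Marginals p(Xi = xi), obtained by summing the joint over the other variable.
p1 = {x1: sum(p for (a, _), p in joint.items() if a == x1) for x1 in range(1, 7)}
p2 = {x2: sum(p for (_, b), p in joint.items() if b == x2) for x2 in range(1, 7)}

# Statistical independence: the joint must equal the product of marginals
# for ALL (x1, x2), not just for some of them.
independent = all(joint[(x1, x2)] == p1[x1] * p2[x2] for x1, x2 in joint)
print(independent)  # True for this joint
```

Using `Fraction` keeps the probabilities exact, so the equality test is a genuine check of the definition rather than a floating-point approximation.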

Independence of Random Variables

Two random variables $X$ and $Y$ are independent $\iff$ the elements of the $\pi$-systems generated by them are independent; that is, for all $x$ and $y$:

$$P(X \le x, Y \le y) = P(X \le x)\,P(Y \le y)$$
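The CDF factorization above can be checked directly for a small discrete joint distribution. This sketch (a hypothetical example of mine: a fair coin $X$ and an independent uniform $Y$ on three values) computes the joint and marginal CDFs from a pmf table and tests the factorization at every point:

```python
from fractions import Fraction
from itertools import product

# Hypothetical joint pmf for (X, Y): X uniform on {0, 1}, Y uniform on
# {0, 1, 2}, chosen independently, so every cell has mass 1/6.
pmf = {(x, y): Fraction(1, 6) for x, y in product(range(2), range(3))}

def joint_cdf(x, y):
    """P(X <= x, Y <= y), summed from the joint pmf."""
    return sum(p for (a, b), p in pmf.items() if a <= x and b <= y)

def cdf_x(x):
    """Marginal CDF P(X <= x)."""
    return sum(p for (a, _), p in pmf.items() if a <= x)

def cdf_y(y):
    """Marginal CDF P(Y <= y)."""
    return sum(p for (_, b), p in pmf.items() if b <= y)

# Independence <=> the joint CDF factorizes at EVERY point (x, y).
factorizes = all(joint_cdf(x, y) == cdf_x(x) * cdf_y(y)
                 for x, y in product(range(2), range(3)))
print(factorizes)  # True
```

For discrete variables it suffices to test the finitely many points in the support; for general variables the factorization must hold for all real $x, y$.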

Conditional Independence

As noted at the beginning of our discussion, independence of events or random variables depends on the underlying probability measure.

Thus, suppose that $B$ is an event with positive probability. A collection of events or a collection of random variables is conditionally independent given $B$ if the collection is independent relative to the conditional probability measure $A \mapsto p(A \mid B)$.

For example, a collection of events $A_1, \ldots, A_n$ is conditionally independent given $B$ if

$$p\left(\bigcap_{i=1}^{n} A_i \,\middle|\, B\right) = \prod_{i=1}^{n} p(A_i \mid B)$$

Note: with $A_i : X_i = x_i$ and $B : Y = y$, the two sides read

$$p\left(\bigcap_{i=1}^{n} A_i \,\middle|\, B\right) = p(X_1 = x_1, \ldots, X_n = x_n \mid Y = y), \qquad \prod_{i=1}^{n} p(A_i \mid B) = \prod_{i=1}^{n} p(X_i = x_i \mid Y = y),$$

for all $x_i \in \mathcal{X}_i$ and all $y \in \mathcal{Y}$, where $X_1, X_2, \ldots, X_n$ are discrete random variables with possible values $\mathcal{X}_1, \ldots, \mathcal{X}_n$ and $Y$ is a discrete random variable with possible values $\mathcal{Y}$. The condition then becomes

$$p(X_1 = x_1, \ldots, X_n = x_n \mid Y = y) = \prod_{i=1}^{n} p(X_i = x_i \mid Y = y)$$
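To make the distinction concrete, here is a hypothetical naive-Bayes-style model of my own construction (not from the references): given the class $Y = y$, the features $X_1, X_2$ are independent Bernoulli variables with a class-dependent parameter. The check confirms conditional independence given $Y$, and also that $X_1, X_2$ are *not* marginally independent, since mixing over $Y$ couples them:

```python
from fractions import Fraction

# Hypothetical model: Y uniform on {0, 1}; given Y = y, the features
# X1, X2 are i.i.d. Bernoulli(theta[y]).
p_y = {0: Fraction(1, 2), 1: Fraction(1, 2)}
theta = {0: Fraction(1, 4), 1: Fraction(3, 4)}  # P(Xi = 1 | Y = y)

def p_x_given_y(x, y):
    """Conditional pmf p(Xi = x | Y = y)."""
    return theta[y] if x == 1 else 1 - theta[y]

def p_x1x2_given_y(x1, x2, y):
    """Joint conditional p(X1 = x1, X2 = x2 | Y = y), independent by construction."""
    return p_x_given_y(x1, y) * p_x_given_y(x2, y)

# Conditional independence: p(x1, x2 | y) = p(x1 | y) p(x2 | y) for all values.
cond_indep = all(
    p_x1x2_given_y(x1, x2, y) == p_x_given_y(x1, y) * p_x_given_y(x2, y)
    for x1 in (0, 1) for x2 in (0, 1) for y in (0, 1)
)

# ... but marginally (mixing over Y) the features are coupled:
def p_x1x2(x1, x2):
    return sum(p_y[y] * p_x1x2_given_y(x1, x2, y) for y in (0, 1))

def p_x(x):
    return sum(p_y[y] * p_x_given_y(x, y) for y in (0, 1))

marginal_indep = all(p_x1x2(x1, x2) == p_x(x1) * p_x(x2)
                     for x1 in (0, 1) for x2 in (0, 1))
print(cond_indep, marginal_indep)  # True False
```

This illustrates the point made above: conditional independence is independence with respect to a *different* probability measure, and it neither implies nor is implied by unconditional independence.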