In probability theory, events are independent if one event occurring does not affect the probability of another event occurring. Formally, two events $A$ and $B$ are independent if and only if the following relation holds:

$$P(A \cap B) = P(A)\,P(B)$$
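As a quick illustration (an added example, not from the original text): for one roll of a fair six-sided die, let $A = \{\text{roll is even}\}$ and $B = \{\text{roll} \le 2\}$. Then $P(A) = 1/2$, $P(B) = 1/3$, and $A \cap B = \{2\}$, so $P(A \cap B) = 1/6 = P(A)\,P(B)$; the two events are independent.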
Random variables

Multiple random variables

$X$ and $Y$ are independent random variables if any event $\{X \in A\}$ is independent of any event $\{Y \in B\}$, i.e.:

$$P(X \in A,\, Y \in B) = P(X \in A)\,P(Y \in B)$$
If $X$ and $Y$ are discrete and independent, this implies that:

$$p_{X,Y}(x, y) = p_X(x)\,p_Y(y)$$
i.e., the joint PMF is equal to the product of the marginal PMFs.
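As an added numerical sketch (not part of the original notes), the following Python snippet simulates two independent fair dice and checks that the empirical joint PMF approximately factors into the product of the empirical marginal PMFs; the sample size and seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent fair dice, values 1..6
x = rng.integers(1, 7, size=n)
y = rng.integers(1, 7, size=n)

# Empirical marginal PMFs p_X and p_Y
p_x = np.bincount(x, minlength=7)[1:] / n
p_y = np.bincount(y, minlength=7)[1:] / n

# Empirical joint PMF p_{X,Y}(a, b)
joint = np.zeros((6, 6))
for a in range(1, 7):
    for b in range(1, 7):
        joint[a - 1, b - 1] = np.mean((x == a) & (y == b))

# For independent X and Y, the joint PMF should equal the product of the
# marginals up to sampling noise (differences on the order of 1e-3 or less).
print(np.abs(joint - np.outer(p_x, p_y)).max())
```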

The same holds true for the continuous case. For independent random variables $X$ and $Y$, their joint CDF is equal to the product of the marginal CDFs:

$$F_{X,Y}(x, y) = F_X(x)\,F_Y(y)$$

They’re also independent if and only if their joint pdf is equal to the product of the marginal pdfs:

$$f_{X,Y}(x, y) = f_X(x)\,f_Y(y)$$
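For the continuous case, here is a similar added sketch (an illustration, not from the original) that draws two independent standard normal samples and compares the empirical joint CDF at a few points with the product of the exact marginal CDFs from SciPy:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 200_000

# Two independent standard normal random variables
x = rng.standard_normal(n)
y = rng.standard_normal(n)

# F_{X,Y}(a, b) should match F_X(a) * F_Y(b) up to sampling noise
for a, b in [(-1.0, 0.0), (0.5, 0.5), (1.0, -0.5)]:
    empirical_joint = np.mean((x <= a) & (y <= b))
    product_of_marginals = norm.cdf(a) * norm.cdf(b)
    print(f"F({a}, {b}): empirical={empirical_joint:.4f}, product={product_of_marginals:.4f}")
```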
For a random variable $X$ with mean $\mu$, consider independent, repeated measurements $X_1, X_2, \ldots, X_n$ of $X$. These are independent, identically distributed (iid) random variables with the same pdf as $X$. We can use the sample mean of the sequence,

$$M_n = \frac{X_1 + X_2 + \cdots + X_n}{n},$$

to estimate the value of $\mu$.
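As a final added sketch (the exponential distribution and its mean are arbitrary assumptions for this example), the sample mean of iid draws gets closer to the true mean $\mu$ as the number of measurements grows:

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 2.5  # assumed true mean for this example

# X_1, ..., X_n are iid exponential random variables with mean mu
for n in (10, 1_000, 100_000):
    samples = rng.exponential(scale=mu, size=n)
    print(n, samples.mean())  # the sample mean M_n approaches mu as n grows
```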