In statistics, the sample mean is the sum of all the data values divided by the number of data values in the sample. Its formula is given by:

$$\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i$$

If we’re dealing with the whole population of data, we can also represent the mean with $\mu$.
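As a quick sketch (the data values here are illustrative), the sample mean is just the sum over the count, which matches the standard library's `statistics.mean`:

```python
# Sketch: computing the sample mean by hand vs. statistics.mean.
from statistics import mean

data = [2.0, 4.0, 6.0, 8.0]  # illustrative sample

# Sum of the values divided by the number of values.
sample_mean = sum(data) / len(data)

assert sample_mean == mean(data)  # both give 5.0
```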

Properties

The expected value also has several helpful properties. For a constant $c$:

$$\mathbb{E}[c] = c \qquad \text{and} \qquad \mathbb{E}[cX] = c\,\mathbb{E}[X]$$

And we can break up the expected value of a sum into a sum of expected values:

$$\mathbb{E}[X + Y] = \mathbb{E}[X] + \mathbb{E}[Y]$$
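These properties can be checked exactly on a small discrete distribution. A minimal sketch, using a fair die as an illustrative example and exact fractions to avoid rounding:

```python
# Sketch: checking E[c] = c, E[cX] = cE[X], and E[X + Y] = E[X] + E[Y]
# on a fair six-sided die, with exact arithmetic.
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair die

def expect(f, pmf):
    """E[f(X)] = sum over outcomes of f(x) * p(x)."""
    return sum(f(x) * p for x, p in pmf.items())

c = 3
E_X = expect(lambda x: x, pmf)                   # 7/2
assert expect(lambda x: c, pmf) == c             # E[c] = c
assert expect(lambda x: c * x, pmf) == c * E_X   # E[cX] = cE[X]

# Two independent dice: E[X + Y] = E[X] + E[Y].
joint = {(x, y): px * py for x, px in pmf.items() for y, py in pmf.items()}
E_sum = sum((x + y) * p for (x, y), p in joint.items())
assert E_sum == 2 * E_X
```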

Probability distributions

For probability distributions, we refer to the mean as the expected value $\mathbb{E}[X]$. We keep the same summation principle. We can think of $\mathbb{E}[X]$ as the centre of mass, if $f(x)$ is a distribution of mass on the real line.

In the continuous case, where $f(x)$ is the probability density function:

$$\mathbb{E}[X] = \int_{-\infty}^{\infty} x\, f(x)\, dx$$

In the discrete case, where $p(x_k)$ is the probability mass function:

$$\mathbb{E}[X] = \sum_{k} x_k\, p(x_k)$$

The expected value is only defined if the integral or sum converges absolutely. There are certain RVs for which it doesn’t converge (e.g. the Cauchy distribution, and Zipf or Pareto distributions with sufficiently heavy tails).
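Both cases can be sketched numerically. The discrete sum is computed directly for a fair die, and the continuous integral is approximated with a crude Riemann sum for an Exponential(1) density (both distributions are illustrative choices):

```python
# Sketch: discrete sum vs. a Riemann approximation of the continuous integral.
import math

# Discrete: fair six-sided die, p(x) = 1/6 for x in 1..6.
die = {x: 1 / 6 for x in range(1, 7)}
E_die = sum(x * p for x, p in die.items())
assert abs(E_die - 3.5) < 1e-12

# Continuous: Exponential(1), density f(x) = e^{-x} for x >= 0, so E[X] = 1.
# Approximate the integral of x * f(x) on [0, 50] with a left Riemann sum.
dx = 1e-3
E_exp = sum(i * dx * math.exp(-i * dx) * dx for i in range(int(50 / dx)))
assert abs(E_exp - 1.0) < 1e-2
```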

We also define the $n$th moment of the random variable as:

$$\mathbb{E}[X^n] = \int_{-\infty}^{\infty} x^n f(x)\, dx \qquad \text{or} \qquad \mathbb{E}[X^n] = \sum_{k} x_k^n\, p(x_k)$$
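A small sketch of the discrete form, again on an illustrative fair die; the first moment is the mean, and the first two moments give the variance:

```python
# Sketch: the nth moment E[X^n] of a fair die, with exact fractions.
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def moment(n, pmf):
    # E[X^n] = sum of x^n * p(x) over the support.
    return sum(x ** n * p for x, p in pmf.items())

assert moment(1, pmf) == Fraction(7, 2)    # the mean
assert moment(2, pmf) == Fraction(91, 6)   # second moment
# Variance from the first two moments: Var(X) = E[X^2] - E[X]^2.
assert moment(2, pmf) - moment(1, pmf) ** 2 == Fraction(35, 12)
```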

We define the conditional expected value, given an event $B$ with $P(B) > 0$, as:

$$\mathbb{E}[X \mid B] = \sum_{k} x_k\, p(x_k \mid B)$$
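Conditioning on an event amounts to restricting the pmf to that event and renormalizing. A minimal sketch, with a fair die and the illustrative event "the roll is even":

```python
# Sketch: E[X | B] for a fair die X and the event B = "X is even".
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}
B = {x for x in pmf if x % 2 == 0}

p_B = sum(pmf[x] for x in B)             # P(B) = 1/2
cond = {x: pmf[x] / p_B for x in B}      # renormalized pmf p(x | B)
E_given_B = sum(x * p for x, p in cond.items())
assert E_given_B == 4                    # (2 + 4 + 6) / 3
```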

Multiple random variables

If the random variables $X_1, \dots, X_n$ are pairwise independent, then for any $i \neq j$:

$$\mathbb{E}[X_i X_j] = \mathbb{E}[X_i]\,\mathbb{E}[X_j]$$
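A sketch of the product rule on two independent fair coins (an illustrative choice) taking values 0 and 1:

```python
# Sketch: for independent X and Y, E[XY] = E[X] E[Y].
from fractions import Fraction

half = Fraction(1, 2)
# Independent joint pmf: p(x, y) = p(x) p(y).
joint = {(x, y): half * half for x in (0, 1) for y in (0, 1)}

E_X = sum(x * p for (x, y), p in joint.items())
E_Y = sum(y * p for (x, y), p in joint.items())
E_XY = sum(x * y * p for (x, y), p in joint.items())
assert E_XY == E_X * E_Y == Fraction(1, 4)
```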

The conditional expected value of a random variable $X$ given $Y = y$ is:

$$\mathbb{E}[X \mid Y = y] = \sum_{x} x\, p_{X \mid Y}(x \mid y)$$
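In the discrete case this is the marginal slice at $Y = y$, renormalized. A sketch on a small hypothetical joint pmf (the numbers are made up for illustration, and X and Y are dependent here):

```python
# Sketch: E[X | Y = y] from a joint pmf, by slicing at y and renormalizing.
from fractions import Fraction

# Hypothetical dependent joint pmf on {0,1} x {0,1}.
p = {(0, 0): Fraction(1, 2), (0, 1): Fraction(1, 4),
     (1, 0): Fraction(1, 8), (1, 1): Fraction(1, 8)}

def cond_expect_X_given_Y(y, p):
    p_y = sum(q for (x, yy), q in p.items() if yy == y)  # marginal P(Y = y)
    return sum(x * q / p_y for (x, yy), q in p.items() if yy == y)

assert cond_expect_X_given_Y(0, p) == Fraction(1, 5)
assert cond_expect_X_given_Y(1, p) == Fraction(1, 3)
```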

The expected value of a function $g(X, Y)$ of multiple RVs is given by:

$$\mathbb{E}[g(X, Y)] = \sum_{x} \sum_{y} g(x, y)\, p_{X,Y}(x, y)$$

with the analogous double integral over $f_{X,Y}(x, y)$ in the continuous case.

In the multiple-RV case, sums break apart as in the single-RV case: $\mathbb{E}[X + Y] = \mathbb{E}[X] + \mathbb{E}[Y]$, with no independence required.
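Both points can be sketched on a small hypothetical joint pmf (illustrative numbers; X and Y are deliberately dependent, yet linearity still holds):

```python
# Sketch: E[g(X, Y)] as a double sum, and linearity without independence.
from fractions import Fraction

# Hypothetical dependent joint pmf on {0,1} x {0,1}.
p = {(0, 0): Fraction(1, 2), (0, 1): Fraction(1, 4),
     (1, 0): Fraction(1, 8), (1, 1): Fraction(1, 8)}

def expect(g, p):
    # E[g(X, Y)] = double sum of g(x, y) * p(x, y).
    return sum(g(x, y) * q for (x, y), q in p.items())

E_X = expect(lambda x, y: x, p)   # 1/4
E_Y = expect(lambda x, y: y, p)   # 3/8
# Linearity needs no independence:
assert expect(lambda x, y: x + y, p) == E_X + E_Y
```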

The $(j, k)$th joint moment of $X$ and $Y$ is given by:

$$\mathbb{E}[X^j Y^k] = \sum_{x} \sum_{y} x^j y^k\, p_{X,Y}(x, y)$$
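A sketch of the joint moment as a double sum, on a small hypothetical joint pmf (illustrative numbers only):

```python
# Sketch: the (j, k)th joint moment E[X^j Y^k] from a joint pmf.
from fractions import Fraction

# Hypothetical joint pmf on {0,1,2} x {0,1}.
p = {(0, 0): Fraction(1, 4), (1, 0): Fraction(1, 4),
     (1, 1): Fraction(1, 4), (2, 1): Fraction(1, 4)}

def joint_moment(j, k, p):
    # E[X^j Y^k] = double sum of x^j * y^k * p(x, y).
    return sum(x ** j * y ** k * q for (x, y), q in p.items())

assert joint_moment(1, 0, p) == 1               # E[X]
assert joint_moment(0, 1, p) == Fraction(1, 2)  # E[Y]
assert joint_moment(1, 1, p) == Fraction(3, 4)  # E[XY], the (1,1) joint moment
```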
