The Kullback-Leibler divergence (KL divergence) measures how much one probability distribution $P$ differs from a second probability distribution $Q$.
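For reference, the discrete form of the definition is (for continuous distributions, the sum becomes an integral over densities):

$$
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}
$$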

Cross entropy is closely related to the KL divergence: $H(P, Q) = H(P) + D_{\mathrm{KL}}(P \,\|\, Q)$, so when $P$ is fixed, minimizing the cross entropy over $Q$ is equivalent to minimizing the KL divergence.
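As a small numerical check of this decomposition, here is a sketch using NumPy; the distributions `p` and `q` below are just illustrative values:

```python
import numpy as np

# Two illustrative discrete distributions over the same support
p = np.array([0.6, 0.3, 0.1])
q = np.array([0.5, 0.25, 0.25])

entropy = -np.sum(p * np.log(p))          # H(P)
cross_entropy = -np.sum(p * np.log(q))    # H(P, Q)
kl = np.sum(p * np.log(p / q))            # D_KL(P || Q)

# Cross entropy decomposes into entropy plus KL divergence
assert np.isclose(cross_entropy, entropy + kl)
print(cross_entropy, entropy + kl)
```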

For two multivariate Gaussian distributions $\mathcal{N}(\mu_0, \Sigma_0)$ and $\mathcal{N}(\mu_1, \Sigma_1)$ in $k$ dimensions, we get:

$$
D_{\mathrm{KL}}\big(\mathcal{N}(\mu_0, \Sigma_0) \,\|\, \mathcal{N}(\mu_1, \Sigma_1)\big)
= \frac{1}{2}\left[ \operatorname{tr}\!\left(\Sigma_1^{-1}\Sigma_0\right)
+ (\mu_1 - \mu_0)^{\top}\Sigma_1^{-1}(\mu_1 - \mu_0)
- k + \ln\frac{\det\Sigma_1}{\det\Sigma_0} \right]
$$
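A minimal NumPy sketch of this closed form; the means `mu0`, `mu1` and covariances `sigma0`, `sigma1` below are made-up example values:

```python
import numpy as np

def kl_mvn(mu0, sigma0, mu1, sigma1):
    """D_KL(N(mu0, sigma0) || N(mu1, sigma1)) for multivariate Gaussians."""
    k = mu0.shape[0]
    sigma1_inv = np.linalg.inv(sigma1)
    diff = mu1 - mu0
    term_trace = np.trace(sigma1_inv @ sigma0)
    term_quad = diff @ sigma1_inv @ diff
    # Log-determinant ratio computed via slogdet for numerical stability
    term_logdet = np.linalg.slogdet(sigma1)[1] - np.linalg.slogdet(sigma0)[1]
    return 0.5 * (term_trace + term_quad - k + term_logdet)

# Illustrative example values
mu0 = np.array([0.0, 0.0])
sigma0 = np.array([[1.0, 0.0], [0.0, 1.0]])
mu1 = np.array([1.0, -1.0])
sigma1 = np.array([[2.0, 0.3], [0.3, 1.5]])

print(kl_mvn(mu0, sigma0, mu1, sigma1))
```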