Singular value decomposition (SVD) says that any real m×n matrix A can be decomposed into a product of three matrices, A = UΣV^T.

  • U: an m×m orthogonal matrix whose columns are the left-singular vectors of A.
  • Σ: an m×n matrix whose first r diagonal entries are the non-zero singular values of A (where r is the rank of A), and whose other entries are all 0.
  • V: an n×n orthogonal matrix whose columns are the right-singular vectors of A.
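As a quick sanity check of these properties, here is a short NumPy sketch (the example matrix is arbitrary, chosen only for illustration): U and V come out orthogonal, Σ has the stated shape, and the product recovers A.

```python
import numpy as np

# A small rectangular example matrix (arbitrary values, for illustration only).
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])  # 3x2, so U is 3x3, Sigma is 3x2, V is 2x2

U, s, Vh = np.linalg.svd(A)           # s holds the singular values
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)  # embed the singular values on the diagonal

# U and V are orthogonal: U^T U = I and V^T V = I
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(Vh @ Vh.T, np.eye(2))

# The product U Sigma V^T recovers A
assert np.allclose(U @ Sigma @ Vh, A)
```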

We can alternatively express the SVD as a sum of rank-one matrices, A = σ_1 u_1 v_1^T + σ_2 u_2 v_2^T + … + σ_r u_r v_r^T, where u_i and v_i are the columns of U and V, respectively. Notice below why this form is useful.
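A small NumPy sketch of this rank-one-sum form (the example matrix is again arbitrary): summing σ_i times the outer product of the i-th columns of U and V rebuilds A exactly.

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])

U, s, Vh = np.linalg.svd(A)

# Rebuild A as sigma_1 u_1 v_1^T + sigma_2 u_2 v_2^T + ...
# (Vh's rows are the v_i, since NumPy returns V transposed.)
A_rebuilt = sum(s[i] * np.outer(U[:, i], Vh[i, :]) for i in range(len(s)))

assert np.allclose(A_rebuilt, A)
```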

We define the singular values of A as the square roots of the eigenvalues of the symmetric matrix A^T A. We list them in decreasing order: σ_1 ≥ σ_2 ≥ … ≥ 0.
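This definition can be checked numerically. The sketch below (arbitrary example matrix) compares the square roots of the eigenvalues of A^T A against the singular values NumPy computes directly.

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

# Eigenvalues of the symmetric matrix A^T A, then square roots in decreasing order.
eigvals = np.linalg.eigvalsh(A.T @ A)    # eigvalsh returns ascending order
sigma_from_eig = np.sqrt(eigvals)[::-1]  # square roots, now descending

# Matches the singular values returned by np.linalg.svd
assert np.allclose(sigma_from_eig, np.linalg.svd(A, compute_uv=False))
```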

Theorems

Every real matrix has a singular value decomposition, which makes SVD more broadly applicable than eigendecomposition (which requires a square, diagonalizable matrix).

Applications

The singular values of A encode some interesting information. For example, a useful application of SVD is data compression. Since the singular values decrease, and many of them are often small relative to σ_1, we can keep (say) only the first 20 terms of the sum σ_1 u_1 v_1^T + … + σ_r u_r v_r^T instead of all r terms (where r can be very large) and still come out with relatively high-quality data.
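A minimal sketch of this idea, using a synthetic low-rank-plus-noise matrix as a stand-in for real data (the matrix and the choice k = 5 are illustrative assumptions, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "data": a rank-5 matrix plus a little noise, so the first few
# singular values dominate all the rest.
A = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 80))
A += 0.01 * rng.standard_normal((100, 80))

U, s, Vh = np.linalg.svd(A, full_matrices=False)

k = 5  # keep only the first k terms of the rank-one sum
A_k = U[:, :k] @ np.diag(s[:k]) @ Vh[:k, :]

# Storage drops from 100*80 numbers to k*(100 + 80 + 1), yet the
# relative error stays small because the discarded singular values are tiny.
rel_error = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(rel_error)
```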

We can also use SVD to compute the Moore-Penrose pseudoinverse of a non-square matrix: A^+ = V Σ^+ U^T, where Σ^+ is obtained by inverting each non-zero singular value and transposing the shape of Σ.
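A sketch of this construction for a full-rank non-square matrix (arbitrary example values), checked against NumPy's built-in np.linalg.pinv:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])  # non-square, full column rank

# Pseudoinverse via SVD: A+ = V Sigma+ U^T, where Sigma+ inverts the
# non-zero singular values (here all of them are non-zero).
U, s, Vh = np.linalg.svd(A, full_matrices=False)
A_pinv = Vh.T @ np.diag(1.0 / s) @ U.T

# Agrees with NumPy's built-in Moore-Penrose pseudoinverse
assert np.allclose(A_pinv, np.linalg.pinv(A))
```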

Computations

In MATLAB, we have related functions:

  • S = svd(A) returns the singular values of A in descending order.
  • [U,S,V] = svd(A) performs the singular value decomposition of A and returns the three matrices described above (note that MATLAB returns V itself, not V^T).

In NumPy, we can use np.linalg.svd(A), which returns a tuple with the attributes U, S, and Vh. Here S is a 1-D array of the singular values in descending order, and Vh is V^T (rather than V itself).
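Because S is a vector and Vh is already transposed, rebuilding a non-square A from NumPy's output takes a little care; passing full_matrices=False gives the more convenient "economy" shapes:

```python
import numpy as np

A = np.random.default_rng(1).standard_normal((4, 3))  # arbitrary 4x3 example

# Full decomposition: U is 4x4, S has 3 entries, Vh is 3x3.
U, S, Vh = np.linalg.svd(A)
Sigma = np.zeros(A.shape)       # S must be placed on the diagonal of a 4x3 matrix
np.fill_diagonal(Sigma, S)
assert np.allclose(U @ Sigma @ Vh, A)

# Economy form avoids the zero-padding: U2 is 4x3, so np.diag(S2) fits directly.
U2, S2, Vh2 = np.linalg.svd(A, full_matrices=False)
assert np.allclose(U2 @ np.diag(S2) @ Vh2, A)
```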