It's a form of matrix decomposition that factors a matrix A into three distinct matrices:

A = UΣVᵀ

where, for A of size m×n, U is an m×m orthogonal matrix (its columns are the left singular vectors), V is an n×n orthogonal matrix (its columns are the right singular vectors), and Σ is an m×n diagonal matrix of singular values in descending order.
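
A quick NumPy sanity check of the shapes and the reconstruction (the matrix here is just an illustrative random one):

```python
import numpy as np

# Illustrative 4x3 matrix; any real matrix works
A = np.random.randn(4, 3)

# Full SVD: U is 4x4, S holds the singular values, Vt is V transposed (3x3)
U, S, Vt = np.linalg.svd(A, full_matrices=True)

# Build the 4x3 diagonal matrix Σ from the singular values
Sigma = np.zeros_like(A)
np.fill_diagonal(Sigma, S)

# U Σ Vᵀ recovers A (up to floating-point error)
print(np.allclose(A, U @ Sigma @ Vt))  # True
print(U.shape, Sigma.shape, Vt.shape)  # (4, 4) (4, 3) (3, 3)
```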

Low-Rank Approximation

The best rank-k approximation of a matrix A is given by keeping only the k largest singular values (and the corresponding columns of U and V): Aₖ = UₖΣₖVₖᵀ.

This means that no rank-k matrix can approximate A better than the k-truncated SVD (this is the Eckart-Young theorem).
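
A minimal NumPy sketch of the rank-k truncation (the matrix size and k are arbitrary choices):

```python
import numpy as np

A = np.random.randn(100, 50)
k = 10

# Thin SVD: U is 100x50, S has the 50 singular values in descending order
U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the k largest singular values and their singular vectors
A_k = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]

# The Frobenius-norm error equals the root of the sum of the
# squared discarded singular values
print(np.linalg.norm(A - A_k, 'fro'))
print(np.sqrt(np.sum(S[k:] ** 2)))  # same value
```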

Applications

  • compression
    • can also compress neural network weights! (see the sketch after this list)
  • noise reduction
  • dimensionality reduction
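
As a sketch of the weight-compression idea (the layer sizes and rank below are made-up numbers): a dense layer's weight matrix W can be replaced by two smaller factors taken from its truncated SVD, which cuts the parameter count when the chosen rank is small.

```python
import numpy as np

# Hypothetical dense-layer weight matrix: 1024 outputs x 4096 inputs
W = np.random.randn(1024, 4096)
k = 64  # chosen rank (illustrative)

U, S, Vt = np.linalg.svd(W, full_matrices=False)

# Factor W ≈ A @ B and store the two small matrices instead of W
A = U[:, :k] * S[:k]   # 1024 x 64
B = Vt[:k, :]          # 64 x 4096

print(W.size)           # 4,194,304 parameters
print(A.size + B.size)  # 327,680 parameters (~8% of the original)
```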

How the SVD is computed

  1. Compute AᵀA
  2. Find the eigendecomposition of AᵀA; its eigenvectors (sorted by descending eigenvalue) form the columns of V
  3. Extract the singular values σᵢ = √λᵢ
  4. Compute the left singular vectors uᵢ = Avᵢ / σᵢ, which form the columns of U
  5. Assemble the decomposition A = UΣVᵀ
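
A sketch of these steps in NumPy (assuming the matrix has full column rank so no singular value is zero; the matrix itself is an arbitrary example):

```python
import numpy as np

A = np.random.randn(5, 3)  # illustrative matrix with full column rank

# 1. Compute AᵀA
AtA = A.T @ A

# 2. Eigendecomposition of AᵀA, sorted by descending eigenvalue
eigvals, eigvecs = np.linalg.eigh(AtA)
order = np.argsort(eigvals)[::-1]
eigvals, V = eigvals[order], eigvecs[:, order]

# 3. Singular values are the square roots of the eigenvalues
S = np.sqrt(eigvals)

# 4. Left singular vectors: uᵢ = A vᵢ / σᵢ
U = (A @ V) / S

# 5. Assemble the decomposition and check it reproduces A
print(np.allclose(A, U @ np.diag(S) @ V.T))  # True
```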

Example: Manual Calculation

Let’s compute the SVD of a simple 2×2 matrix:

Step 1: Compute AᵀA

Step 2: Find eigenvalues

Step 3: Singular values

Step 4: Find eigenvectors of AᵀA for V

Step 5: Compute U

Result:
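
As a concrete illustration of these steps (the matrix below is an arbitrary choice, picked so the arithmetic stays clean):

$$
A = \begin{pmatrix} 3 & 0 \\ 4 & 5 \end{pmatrix}, \qquad
A^{\mathsf T}A = \begin{pmatrix} 25 & 20 \\ 20 & 25 \end{pmatrix}
$$

The eigenvalues of AᵀA are λ₁ = 45 and λ₂ = 5, so the singular values are σ₁ = √45 = 3√5 and σ₂ = √5. The unit eigenvectors of AᵀA form the columns of V, and uᵢ = Avᵢ / σᵢ gives the columns of U:

$$
V = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}, \qquad
U = \frac{1}{\sqrt{10}}\begin{pmatrix} 1 & 3 \\ 3 & -1 \end{pmatrix}, \qquad
\Sigma = \begin{pmatrix} 3\sqrt{5} & 0 \\ 0 & \sqrt{5} \end{pmatrix}
$$

Multiplying out UΣVᵀ recovers A.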

In practice, there are library implementations in NumPy and PyTorch that you can use. The approach above is not the standard way of computing the SVD, because eigendecomposition of very large matrices is very expensive.
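
For example (a minimal sketch; the matrix is an arbitrary random one):

```python
import numpy as np
import torch

A = np.random.randn(6, 4)

# NumPy: thin SVD (U: 6x4, S: 4 singular values, Vt: 4x4)
U, S, Vt = np.linalg.svd(A, full_matrices=False)

# PyTorch: same decomposition, differentiable and GPU-capable
U_t, S_t, Vh_t = torch.linalg.svd(torch.from_numpy(A), full_matrices=False)

print(np.allclose(S, S_t.numpy()))  # True: the singular values agree
```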

machineLearning