Singular-Value Decomposition(SVD)


Singular-Value Decomposition

The definition of SVD is as follows:

For a rectangular matrix $A \in \mathbb{R}^{m \times n}$, SVD decomposes it into:

- Orthogonal matrix $U \in \mathbb{R}^{m \times m}$
- Orthogonal matrix $V \in \mathbb{R}^{n \times n}$
- Diagonal matrix $\Sigma \in \mathbb{R}^{m \times n}$

$$A = U\Sigma V^T \quad ...(1)$$
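Equation (1) can be checked numerically. A minimal sketch using NumPy (note that `np.linalg.svd` returns $V^T$ directly, and the singular values as a 1-D array rather than the full $\Sigma$):

```python
import numpy as np

# A random rectangular matrix with m = 4, n = 3
A = np.random.default_rng(0).standard_normal((4, 3))

# full_matrices=True gives U (4x4) and V^T (3x3); s holds the singular values
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n "diagonal" matrix Sigma from the 1-D singular values
Sigma = np.zeros((4, 3))
Sigma[:3, :3] = np.diag(s)

# A equals U @ Sigma @ V^T up to floating-point error
assert np.allclose(A, U @ Sigma @ Vt)
```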

Orthogonal Matrix

An orthogonal matrix is a square matrix $X \in \mathbb{R}^{n \times n}$ with the following property:

$$XX^T = X^TX = I$$

In other words, $X^T = X^{-1}$.
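A 2-D rotation matrix is a familiar example of an orthogonal matrix; a quick check (the angle here is an arbitrary choice for illustration):

```python
import numpy as np

theta = 0.7  # arbitrary rotation angle
X = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# X X^T = X^T X = I, so the transpose is the inverse
assert np.allclose(X @ X.T, np.eye(2))
assert np.allclose(X.T @ X, np.eye(2))
```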

The meaning of $U$, $\Sigma$, $V$

To understand the meaning of $U$, $\Sigma$, and $V$, we multiply equation (1) on the right by its transpose $A^T$:

$$AA^T = U\Sigma V^T (U\Sigma V^T)^T = U\Sigma V^T V \Sigma^T U^T = U(\Sigma \Sigma^T) U^T = U(\Sigma \Sigma^T) U^{-1}$$
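The derivation above can be verified numerically: the eigenvalues of $AA^T$ should equal the squared singular values of $A$ (padded with zeros when $m > n$). A sketch with NumPy, using `eigvalsh` since $AA^T$ is symmetric:

```python
import numpy as np

A = np.random.default_rng(1).standard_normal((4, 3))

# Eigenvalues of the symmetric matrix A A^T, sorted in descending order
eigvals = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]

# Squared singular values of A, padded with zeros up to m = 4
s = np.linalg.svd(A, compute_uv=False)
padded = np.concatenate([s**2, np.zeros(4 - len(s))])

# eig(A A^T) = sigma^2, matching Sigma Sigma^T in the derivation
assert np.allclose(eigvals, padded)
```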

This is exactly the form of an Eigen-Value Decomposition (EVD).

$U$ is the matrix of eigenvectors of $AA^T$. Likewise, $V$ is the matrix of eigenvectors of $A^TA$, and $\Sigma\Sigma^T$ is a diagonal matrix whose diagonal entries are the eigenvalues of $AA^T$.

How the calculation is done

Since we know the meaning of $U$, $\Sigma$, and $V$, we can easily decompose $A$:

Get the eigenvectors of $AA^T$ and build up $U$.

Get the eigenvalues of $AA^T$; their square roots are the diagonal entries of $\Sigma$ (the singular values).

Calculate the remaining $V$ using the inverse relation from equation (1): $A^T U = V\Sigma^T$, so each column is $v_i = A^T u_i / \sigma_i$.
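The steps above can be sketched directly with NumPy's eigendecomposition. This is a toy illustration, not a numerically robust SVD; eigenvector signs are arbitrary, so we verify the reconstruction rather than comparing individual factors against `np.linalg.svd`:

```python
import numpy as np

A = np.random.default_rng(2).standard_normal((3, 2))  # m = 3, n = 2

# Step 1: eigenvectors of A A^T give U (eigh sorts ascending, so reverse)
w, U = np.linalg.eigh(A @ A.T)
idx = np.argsort(w)[::-1]
w, U = w[idx], U[:, idx]

# Step 2: singular values are square roots of the eigenvalues of A A^T
r = 2  # rank = min(m, n) here; clip tiny negatives from round-off
sigma = np.sqrt(np.clip(w[:r], 0, None))

# Step 3: recover V column-wise via v_i = A^T u_i / sigma_i
V = A.T @ U[:, :r] / sigma

# Reconstruct A = U Sigma V^T and compare
Sigma = np.zeros((3, 2))
Sigma[:r, :r] = np.diag(sigma)
assert np.allclose(A, U @ Sigma @ V.T)
```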

References

[1] https://youtu.be/mBcLRGuAFUk?si=Ld1YyxMeRy2el9-a