Eigenvalues and Eigenvectors
Let's say we have a matrix $A \in \mathbb{R}^{n \times n}$.
The solutions $x \in \mathbb{R}^{n \times 1}$ (with $x \neq 0$) and $\lambda \in \mathbb{R}$ of the following equation are called eigenvectors and eigenvalues:

$$Ax = \lambda x$$
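As a quick numerical check (a sketch using NumPy, not part of the original derivation), `np.linalg.eig` computes the eigenvalues and eigenvectors of a square matrix, and each pair satisfies $Ax = \lambda x$:

```python
import numpy as np

# A small square matrix (example values chosen for illustration)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# eig returns eigenvalues w and a matrix v whose columns are eigenvectors
w, v = np.linalg.eig(A)

# Each column v[:, i] satisfies A x = lambda x
for i in range(len(w)):
    assert np.allclose(A @ v[:, i], w[i] * v[:, i])
```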
Eigen-Value Decomposition
We are trying to decompose a square matrix $A \in \mathbb{R}^{n \times n}$.
Let $\Lambda$ denote the diagonal matrix whose diagonal elements are the eigenvalues of $A$.
By the definition of eigenvalues and eigenvectors, we can write the following equations:

$$Ax_1 = \lambda_1 x_1,\quad Ax_2 = \lambda_2 x_2,\quad \dots,\quad Ax_n = \lambda_n x_n$$

Stacking the eigenvectors as the columns of a matrix $X$, these become a single matrix equation:

$$A\begin{bmatrix} \vert & \vert & & \vert \\ x_1 & x_2 & \cdots & x_n \\ \vert & \vert & & \vert \end{bmatrix} = \begin{bmatrix} \vert & \vert & & \vert \\ x_1 & x_2 & \cdots & x_n \\ \vert & \vert & & \vert \end{bmatrix} \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix}$$

that is, $AX = X\Lambda$. Multiplying both sides on the right by $X^{-1}$ (assuming the eigenvectors are linearly independent, so that $X$ is invertible), we get the following equation:
$$A = X \Lambda X^{-1}$$

Special Case when A is a Symmetric Square Matrix
There is a special case for EVD.
If the matrix $A$ is symmetric and square and has distinct eigenvalues, and we take the eigenvectors to be unit-norm, then we can express the EVD as follows:
$$A = X \Lambda X^{T}$$

Proof of the special case
We just have to prove that $X^{-1} = X^{T}$, i.e., that $X$ is an orthogonal matrix. Take two eigenvectors $v_1, v_2$ of $A$ with distinct eigenvalues $\lambda_1, \lambda_2$. Since $A$ is symmetric ($A = A^{T}$):
$$\lambda_1 (v_1 \cdot v_2) = (A v_1) \cdot v_2 = v_1 \cdot (A^{T} v_2) = v_1 \cdot (A v_2) = \lambda_2 (v_1 \cdot v_2) \quad \dots (1)$$

From equation (1), we can derive the following:
$$\lambda_1 (v_1 \cdot v_2) - \lambda_2 (v_1 \cdot v_2) = (\lambda_1 - \lambda_2)(v_1 \cdot v_2) = 0$$

Since we assumed that $A$ has distinct eigenvalues, $\lambda_1 - \lambda_2 \neq 0$, so it must be that $v_1 \cdot v_2 = 0$.
So the eigenvectors are pairwise orthogonal; normalizing them to unit length makes the columns of $X$ orthonormal, and therefore $X^{-1} = X^{T}$.
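A numerical sketch of the symmetric case (assuming NumPy; `np.linalg.eigh` is the routine specialized for symmetric matrices and returns unit-norm, orthonormal eigenvectors):

```python
import numpy as np

# A symmetric matrix (example values chosen for illustration)
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is specialized for symmetric (Hermitian) matrices;
# the columns of X are orthonormal eigenvectors
eigvals, X = np.linalg.eigh(A)
Lam = np.diag(eigvals)

# X is orthogonal: X^T X = I, so X^{-1} = X^T
assert np.allclose(X.T @ X, np.eye(3))

# The decomposition A = X Lam X^T reconstructs A
assert np.allclose(X @ Lam @ X.T, A)
```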