Basics of Linear Algebra
Linearity is the simplest structure in mathematics. Let's review the basic notation and terminology of linear algebra.

Definition(Matrix). A matrix $A \in \mathbb{R}^{m \times n}$ is a rectangular array of real numbers with $m$ rows and $n$ columns,

$$A = \begin{pmatrix} A_{11} & \cdots & A_{1n} \\ \vdots & \ddots & \vdots \\ A_{m1} & \cdots & A_{mn} \end{pmatrix},$$

where $A_{ij}$ denotes the entry in the $i$-th row and $j$-th column.

The transpose of $A$, denoted $A^\top \in \mathbb{R}^{n \times m}$, is obtained by swapping rows and columns. In other words, $(A^\top)_{ij} = A_{ji}$.

Note: we treat a vector $x \in \mathbb{R}^n$ as a column vector, i.e., an $n \times 1$ matrix, so that $x^\top$ is a row vector.

Definition(Sums and products of matrices). The sum of matrices $A, B \in \mathbb{R}^{m \times n}$ is defined entry-wise by $(A + B)_{ij} = A_{ij} + B_{ij}$.

The product of matrices $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$ is the matrix $AB \in \mathbb{R}^{m \times p}$ with $(AB)_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj}$.

The $n \times n$ identity matrix $I_n$ has ones on the diagonal and zeros elsewhere. We have $I_m A = A I_n = A$ for any $A \in \mathbb{R}^{m \times n}$.

If it exists, the inverse of a square matrix $A \in \mathbb{R}^{n \times n}$ is the unique matrix $A^{-1}$ satisfying $A A^{-1} = A^{-1} A = I_n$.

The trace of a square matrix $A \in \mathbb{R}^{n \times n}$ is the sum of its diagonal entries, $\operatorname{tr}(A) = \sum_{i=1}^{n} A_{ii}$.

We say $A$ is symmetric if $A = A^\top$.

The rank of a matrix $A \in \mathbb{R}^{m \times n}$, denoted $\operatorname{rank}(A)$, is the dimension of the span of its columns (equivalently, of its rows); it satisfies $\operatorname{rank}(A) \le \min(m, n)$.
Numpy Matrix Operations
NumPy's matrix operations are optimized on top of the BLAS and LAPACK libraries, so in Python it is usually faster to use NumPy than to write your own loops.
```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# Identity matrix
I = np.eye(2)

# Matrix addition
C = A + B

# Matrix multiplication
D = A @ B

# Matrix entry-wise (Hadamard) multiplication
E = A * B

# Matrix transpose
F = A.T

# Matrix inverse
G = np.linalg.inv(A)
```
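The trace and rank from the definitions above also have NumPy counterparts. A small illustrative sketch (the matrices here are arbitrary choices, not from the text):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])

# Trace: sum of the diagonal entries, 1 + 4 = 5
tr = np.trace(A)

# Rank: dimension of the column space; A is invertible, so full rank 2
r = np.linalg.matrix_rank(A)

# A rank-deficient example: the second row is twice the first, so rank 1
B = np.array([[1, 2], [2, 4]])
r_B = np.linalg.matrix_rank(B)

print(tr, r, r_B)  # 5 2 1
```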
Eigenvalues and Eigenvectors
Given two vectors $u, v \in \mathbb{R}^n$, their inner product is $\langle u, v \rangle = u^\top v = \sum_{i=1}^{n} u_i v_i$. We say $u$ and $v$ are orthogonal if $u^\top v = 0$, and orthonormal if in addition $\|u\|_2 = \|v\|_2 = 1$.

Definition(Eigenvalues and Eigenvectors). We say $\lambda$ is an eigenvalue of a square matrix $A \in \mathbb{R}^{n \times n}$ with eigenvector $v \neq 0$ if $A v = \lambda v$.
Theorem(Eigenvalue Decomposition). A symmetric matrix $A \in \mathbb{R}^{n \times n}$ has

- Real eigenvalues: $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n$,
- Orthonormal eigenvectors: $v_1, \dots, v_n \in \mathbb{R}^n$ such that $A v_i = \lambda_i v_i$ and $v_i^\top v_j = \mathbb{1}\{i = j\}$ for all $i, j$,

such that we have the eigenvalue decomposition:

$$A = \sum_{i=1}^{n} \lambda_i v_i v_i^\top.$$

Denote $\Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$ and $V = (v_1, \dots, v_n) \in \mathbb{R}^{n \times n}$, which satisfies $V^\top V = V V^\top = I_n$. The eigenvalue decomposition can also be written as

$$A = V \Lambda V^\top.$$
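To make the theorem concrete, here is a small NumPy check of the factorization $A = V \Lambda V^\top$ (the example matrix is our own choice, not from the text):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # symmetric, eigenvalues 1 and 3

# eigh is specialized to symmetric matrices: it returns real eigenvalues
# in ascending order and orthonormal eigenvectors as the columns of V
lam, V = np.linalg.eigh(A)

# Reconstruct A = V @ Lambda @ V^T
assert np.allclose(A, V @ np.diag(lam) @ V.T)

# The eigenvectors are orthonormal: V^T V = I
assert np.allclose(V.T @ V, np.eye(2))
```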
As a cornerstone of this chapter, we prefer to interpret concepts as solutions of optimization problems. Such an interpretation is usually called a variational form. The following theorem gives a variational form of the eigenvalues.
Theorem(Variational Form of Eigenvalues). Given a symmetric matrix $A \in \mathbb{R}^{n \times n}$, its maximum eigenvalue satisfies

$$\lambda_1 = \max_{\|x\|_2 = 1} x^\top A x \quad \text{and} \quad v_1 \in \operatorname*{arg\,max}_{\|x\|_2 = 1} x^\top A x,$$

and its minimum eigenvalue satisfies

$$\lambda_n = \min_{\|x\|_2 = 1} x^\top A x \quad \text{and} \quad v_n \in \operatorname*{arg\,min}_{\|x\|_2 = 1} x^\top A x.$$
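A quick numerical sanity check of the variational form, evaluating the quadratic $x^\top A x$ over random unit vectors (the matrix and sample size are arbitrary choices for illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # symmetric, eigenvalues 1 and 3
lam = np.linalg.eigvalsh(A)             # ascending order
lam_min, lam_max = lam[0], lam[-1]

# Sample random unit vectors and evaluate the quadratic form x^T A x
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
X /= np.linalg.norm(X, axis=1, keepdims=True)
q = np.einsum('ij,jk,ik->i', X, A, X)  # q[t] = x_t^T A x_t

# Every value lies in [lam_min, lam_max], as the theorem asserts
assert lam_min - 1e-9 <= q.min() and q.max() <= lam_max + 1e-9
```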
Question: What is the variational form of other eigenvalues? See the visualization in the figure above.
The concept of eigenvalue decomposition can be generalized to non-symmetric, or even non-square, matrices. We have the following theorem on the singular value decomposition.
Theorem(Singular Value Decomposition). Given a rank-$r$ matrix $A \in \mathbb{R}^{m \times n}$, there exist

- Singular values: $\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_r > 0$, and denote $\Sigma = \operatorname{diag}(\sigma_1, \dots, \sigma_r)$,
- Orthogonal matrices: $U = (u_1, \dots, u_r) \in \mathbb{R}^{m \times r}$ and $V = (v_1, \dots, v_r) \in \mathbb{R}^{n \times r}$ satisfying $U^\top U = V^\top V = I_r$,

such that we have the singular value decomposition (SVD):

$$A = U \Sigma V^\top = \sum_{i=1}^{r} \sigma_i u_i v_i^\top.$$

We can see that the eigenvalue decomposition is a special SVD: for a symmetric positive semidefinite matrix we can take $U = V$ and $\Sigma = \Lambda$ (restricted to the nonzero eigenvalues).
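`np.linalg.svd` computes this decomposition; a short sketch on an arbitrary rectangular example:

```python
import numpy as np

A = np.array([[3.0, 2.0, 2.0], [2.0, 3.0, -2.0]])  # rank 2, shape (2, 3)

# Thin SVD: U is (2, 2), s holds the singular values, Vt is V^T of shape (2, 3)
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Reconstruction A = U Sigma V^T
assert np.allclose(A, U @ np.diag(s) @ Vt)

# Singular values are non-negative and sorted in decreasing order
assert np.all(s >= 0) and np.all(s[:-1] >= s[1:])

# Columns of U and V are orthonormal
assert np.allclose(U.T @ U, np.eye(2))
assert np.allclose(Vt @ Vt.T, np.eye(2))
```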
Given the concepts above, we can define the matrix operator norm.
Definition(Matrix Operator Norm). For a matrix $A \in \mathbb{R}^{m \times n}$, its operator norm (also called the spectral norm) is

$$\|A\|_2 = \max_{\|x\|_2 = 1} \|A x\|_2 = \sigma_1,$$

where $\sigma_1$ is the largest singular value of $A$.
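In NumPy, `ord=2` gives the operator norm of a matrix; a minimal check on an arbitrary example matrix that it matches the largest singular value:

```python
import numpy as np

A = np.array([[3.0, 2.0, 2.0], [2.0, 3.0, -2.0]])

# ord=2 on a 2-D array gives the operator (spectral) norm
op_norm = np.linalg.norm(A, 2)

# It equals the largest singular value sigma_1
sigma = np.linalg.svd(A, compute_uv=False)
assert np.isclose(op_norm, sigma[0])
```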
Connection between SVD and Eigenvalue Decomposition:

- The singular values of $A$ are the square roots of the eigenvalues of $A^\top A$ (equivalently, of $A A^\top$): $\sigma_i = \sqrt{\lambda_i(A^\top A)}$.
- The left singular vectors of $A$ are the same as the eigenvectors of $A A^\top$.
- The right singular vectors of $A$ are the same as the eigenvectors of $A^\top A$.
- If $A$ is symmetric, it has real eigenvalues and eigenvectors. Assume $A = V \Lambda V^\top$, then $\sigma_i = |\lambda_i|$. Note that here we order the eigenvalues by their absolute values.
- If $A$ is positive semidefinite, it has non-negative eigenvalues and eigenvectors, and the SVD is the same as the eigenvalue decomposition (by ignoring the zero eigenvalues).
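These connections are easy to check numerically; a sketch with an arbitrary rank-2 matrix:

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0], [0.0, 3.0, 1.0]])  # shape (2, 3), rank 2

s = np.linalg.svd(A, compute_uv=False)  # two singular values, descending

# Eigenvalues of A^T A in descending order (eigvalsh returns ascending)
lam = np.linalg.eigvalsh(A.T @ A)[::-1]

# sigma_i = sqrt(lambda_i(A^T A)); the third eigenvalue of A^T A is 0
assert np.allclose(s, np.sqrt(np.clip(lam[:2], 0.0, None)))
assert np.isclose(lam[2], 0.0)
```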