Optimization 1: Linear Algebra Review
Linear Algebra Review Notes
Or, everything you ever needed to know from linear algebra.
Vectors
Elements in $\mathbb{R}^n$ are Vectors. A vector can be thought of as $n$ real numbers stacked on top of each other (a column vector).
Properties of Vectors
Let $a, b$ be vectors in $\mathbb{R}^n$, and $\alpha$ a scalar in $\mathbb{R}$. Let $\theta$ be the angle between vectors $a$ and $b$.

$a^T = (a_1, a_2, \dots, a_n)$ is the Transpose of a vector (a row vector).

 Vector Addition: $a + b = (a_1 + b_1, a_2 + b_2, \dots, a_n + b_n)^T$
 Scalar Multiplication: $\alpha a = (\alpha a_1, \alpha a_2, \dots, \alpha a_n)^T$
 The scalar product between two vectors, or Dot Product: $a^T b = \sum_{i=1}^{n} a_i b_i$
The scalar product is also written as $\langle a, b \rangle$ or as $a \cdot b$. The scalar product is Symmetric. That is, $a^T b = b^T a$.
 The vector Norm or length of a vector is defined by $\|a\| = \sqrt{a^T a} = \sqrt{\sum_{i=1}^{n} a_i^2}$
There are different “kinds” of norms. This particular norm is called the Euclidean Norm for a vector. More information on norms can be found here.
 Cauchy-Bunyakovsky-Schwarz Inequality: The following inequality holds: $|a^T b| \le \|a\| \, \|b\|$
 Triangle Inequality: The following inequality holds: $\|a + b\| \le \|a\| + \|b\|$
 The cosine of the angle between vectors $a$ and $b$ is $\cos \theta = \frac{a^T b}{\|a\| \, \|b\|}$
 $a^T b = 0$ means the vectors $a$ and $b$ are Orthogonal. The following is true: $a^T b = 0 \iff \|a + b\|^2 = \|a\|^2 + \|b\|^2$
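The vector properties above are easy to check numerically. A small NumPy sketch (the vectors $a$ and $b$ are arbitrary examples):

```python
import numpy as np

a = np.array([3.0, 4.0, 0.0])
b = np.array([1.0, 2.0, 2.0])

dot = a @ b                       # scalar product a^T b
norm_a = np.linalg.norm(a)        # Euclidean norm sqrt(a^T a)
norm_b = np.linalg.norm(b)

# Cauchy-Bunyakovsky-Schwarz: |a^T b| <= ||a|| ||b||
assert abs(dot) <= norm_a * norm_b

# Triangle inequality: ||a + b|| <= ||a|| + ||b||
assert np.linalg.norm(a + b) <= norm_a + norm_b

# Cosine of the angle between a and b
cos_theta = dot / (norm_a * norm_b)
```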
Vector Definitions
 A linear subspace is a set $S \subseteq \mathbb{R}^n$ enjoying the following properties:
 For all $x, y \in S$, it holds that $x + y \in S$
 For all $x \in S$ and $\alpha \in \mathbb{R}$, it holds that $\alpha x \in S$
 An affine subspace is any set that can be represented as $x_0 + S = \{x_0 + x : x \in S\}$, for some vector $x_0$ in $\mathbb{R}^n$ and linear subspace $S$
 A set of vectors $\{x_1, \dots, x_k\}$ is said to be Linearly Independent if and only if $\sum_{i=1}^{k} \alpha_i x_i = 0$ implies $\alpha_1 = \alpha_2 = \dots = \alpha_k = 0$
There can be at most $n$ linearly independent vectors in $\mathbb{R}^n$. Nonzero orthogonal vectors are also linearly independent. Linearly independent vectors are not necessarily orthogonal.
 Any collection of $n$ linearly independent vectors in $\mathbb{R}^n$ is called a Basis.
 A basis is said to be an Orthogonal Basis if $x_i^T x_j = 0$ for all distinct vectors $x_i, x_j$ in the collection.
 A basis is said to be an Orthonormal Basis if, in addition, $\|x_i\| = 1$ for every vector in the collection.
 The Standard Basis is the collection of orthonormal vectors $e_1, \dots, e_n$ such that $e_i$ has a $1$ in the $i$th position and $0$s elsewhere. The vector space $\mathbb{R}^n$ is equipped with the standard basis.
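A practical test for linear independence: stack the vectors as columns of a matrix and compare its rank to the number of vectors. A sketch (the vectors are arbitrary examples):

```python
import numpy as np

# Candidate vectors in R^3, stacked as the columns of V.
V = np.column_stack([[1.0, 0.0, 1.0],
                     [0.0, 1.0, 1.0],
                     [1.0, 1.0, 0.0]])

# Linearly independent iff rank(V) equals the number of vectors.
independent = np.linalg.matrix_rank(V) == V.shape[1]

# The standard basis e_1, ..., e_n is the set of columns of the identity.
E = np.eye(3)
```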
Matrices
Consider two spaces $\mathbb{R}^n$ and $\mathbb{R}^m$. All linear functions from $\mathbb{R}^n$ to $\mathbb{R}^m$ may be described using the linear space of Real Matrices $\mathbb{R}^{m \times n}$.
Properties of Matrices
Let $A, B \in \mathbb{R}^{m \times n}$, and $\alpha$ a scalar in $\mathbb{R}$. Let $A = (a_{ij})$ and $B = (b_{ij})$.
 $A^T$, with $(A^T)_{ij} = a_{ji}$, is the Transpose of a matrix.
 $(A^T)^T = A$.
 $(A + B)^T = A^T + B^T$.
 $(AB)^T = B^T A^T$.
 Matrix Addition: $(A + B)_{ij} = a_{ij} + b_{ij}$
 Scalar Multiplication: $(\alpha A)_{ij} = \alpha a_{ij}$
 Matrix Multiplication: Let $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$. Then $C = AB \in \mathbb{R}^{m \times p}$, where $c_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}$
Matrix multiplication is tricky and should be seriously practiced. Matrix multiplication is Associative (meaning $(AB)C = A(BC)$), but it is Non-Commutative (meaning $AB$ may not equal $BA$).

Let $x \in \mathbb{R}^n$ be a vector. Then $y = Ax \in \mathbb{R}^m$, where $y_i = \sum_{j=1}^{n} a_{ij} x_j$

The Norm of a matrix is defined by $\|A\| = \sqrt{\sum_{i=1}^{m} \sum_{j=1}^{n} a_{ij}^2}$
 There are different “kinds” of matrix norms. This particular norm is the Frobenius Norm (the Euclidean norm of $A$ viewed as a vector of its entries). More information on norms can be found here.
 $\|AB\| \le \|A\| \, \|B\|$.
 Triangle Inequality: $\|A + B\| \le \|A\| + \|B\|$
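A few of the matrix properties above, checked in NumPy (the matrices are arbitrary examples):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
C = np.array([[2.0, 0.0],
              [0.0, 2.0]])

# Associativity: (AB)C = A(BC)
assert np.allclose((A @ B) @ C, A @ (B @ C))

# Non-commutativity: AB need not equal BA
commutes = np.allclose(A @ B, B @ A)   # False for these A, B

# Transpose of a product reverses the order: (AB)^T = B^T A^T
assert np.allclose((A @ B).T, B.T @ A.T)

# Frobenius norm: square root of the sum of squared entries
fro = np.linalg.norm(A, 'fro')         # sqrt(1 + 4 + 9 + 16) = sqrt(30)
```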
Matrix Definitions
 The Range or Range Space of a matrix is defined as $\mathcal{R}(A) = \{ Ax : x \in \mathbb{R}^n \}$
 The Kernel or Null Space of a matrix is defined as $\mathcal{N}(A) = \{ x \in \mathbb{R}^n : Ax = 0 \}$
The kernel of $A$ is a subspace of $\mathbb{R}^n$; the range of $A$ is a subspace of $\mathbb{R}^m$.
 The Rank of a matrix is defined as the dimension of the range space of $A$, which equals the dimension of the range space of $A^T$ (row rank equals column rank).
 $\operatorname{rank}(A) \le \min(m, n)$.
 A matrix is said to have Full Rank if the rank of $A$ equals $\min(m, n)$
 The rank of a matrix is also equal to the number of Pivot Points it has when row reduced. This video explains more.
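Rank, range, and null space can be explored numerically. A sketch using a deliberately rank-deficient matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])    # second row = 2 * first row

rank = np.linalg.matrix_rank(A)    # 1: only one independent row/column
assert rank == np.linalg.matrix_rank(A.T)   # rank(A) = rank(A^T)
assert rank <= min(A.shape)

# Rank-nullity: dim N(A) = n - rank(A)
null_dim = A.shape[1] - rank       # 2

# Any null-space vector x satisfies Ax = 0, e.g. x = (2, -1, 0)^T
x = np.array([2.0, -1.0, 0.0])
assert np.allclose(A @ x, 0)
```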
Square Matrices
Square Matrices have additional unique properties. Let $A \in \mathbb{R}^{n \times n}$.
 The determinant of a square matrix is a special number with unique properties. The determinant of a $2 \times 2$ matrix is calculated by
$\det \begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad - bc$; calculating the determinant of larger matrices is more complex, but a nice tutorial can be found here.
 There exists the Identity Matrix $I$ (ones on the diagonal, zeros elsewhere) such that $AI = IA = A$
 If $Ax = \lambda x$ holds for some nonzero vector $x$ and scalar $\lambda$, then we call $x$ an Eigenvector of $A$, and we call $\lambda$ an Eigenvalue of $A$.
 The set of Eigenvectors of $A$ corresponding to a given eigenvalue, together with the zero vector, forms a linear subspace of $\mathbb{R}^n$ (the eigenspace of that eigenvalue).
 Every $n \times n$ matrix has $n$ eigenvalues (counted with multiplicity). Eigenvalues may be complex.
 The sum of the eigenvalues of $A$ is the same as the Trace of $A$ (that is, the sum of the diagonal elements of $A$): $\sum_{i=1}^{n} \lambda_i = \operatorname{tr}(A)$
 The product of the eigenvalues of $A$ is the determinant of $A$: $\prod_{i=1}^{n} \lambda_i = \det(A)$
 The roots of the Characteristic Equation $\det(A - \lambda I) = 0$ are the eigenvalues of $A$.
 The norm of $A$ is at least as large as the largest absolute value of its eigenvalues.
 If $A$ is invertible, the eigenvalues of $A^{-1}$ are the inverses of the eigenvalues of $A$
 For square matrices, the eigenvalues of $A^T$ are the eigenvalues of $A$
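The trace, determinant, and eigenvalue facts above can be verified directly (a sketch with an arbitrary $2 \times 2$ example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals = np.linalg.eigvals(A)     # here: 3 and 1

# Sum of eigenvalues = trace(A)
assert np.isclose(eigvals.sum(), np.trace(A))

# Product of eigenvalues = det(A)
assert np.isclose(eigvals.prod(), np.linalg.det(A))

# A^T has the same eigenvalues as A
assert np.allclose(np.sort(np.linalg.eigvals(A.T)), np.sort(eigvals))

# Eigenvalues of A^{-1} are the inverses of the eigenvalues of A
inv_eigs = np.linalg.eigvals(np.linalg.inv(A))
assert np.allclose(np.sort(inv_eigs), np.sort(1.0 / eigvals))
```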
 For a square matrix $A$ there can exist an Inverse Matrix $A^{-1}$, with $A A^{-1} = A^{-1} A = I$
 An invertible square matrix is called Nonsingular. Nonsingular matrices have the following properties, and each of the following properties implies a matrix is nonsingular:
 Its inverse $A^{-1}$ exists.
 The rows and columns of $A$ are linearly independent.
 The system of linear equations $Ax = b$ has a unique solution $x$ for all $b$.
 The homogeneous system of equations $Ax = 0$ has only one solution, $x = 0$.
 $A^T$ is nonsingular.
 $\det(A) \ne 0$.
 None of the eigenvalues of $A$ are $0$.
 The set of all $n \times n$ matrices forms a noncommutative Ring under matrix addition and multiplication. The nonsingular matrices form a Group under multiplication; note they are not closed under addition (for example, $A + (-A) = 0$ is singular).
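Several of the nonsingularity conditions, checked on one example matrix (a sketch):

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

# det(A) != 0 and no zero eigenvalue => nonsingular
det = np.linalg.det(A)                       # 4*6 - 7*2 = 10
assert abs(det) > 1e-12
assert np.all(np.abs(np.linalg.eigvals(A)) > 1e-12)

# The inverse exists and A A^{-1} = I
A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, np.eye(2))

# Ax = b has a unique solution (solve is preferred over inv(A) @ b)
b = np.array([1.0, 0.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)
```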
 We call $A$ Symmetric if and only if $A = A^T$. Symmetric matrices have the following properties:
 $a_{ij} = a_{ji}$ for all $i, j$.
 Eigenvalues of symmetric matrices are real.
 Eigenvectors corresponding to distinct Eigenvalues are orthogonal to each other.
 The spectral norm (the induced 2-norm) of a symmetric matrix equals the largest absolute value of its eigenvalues; the Frobenius norm is at least this large.
 Even if $A$ is a nonsquare matrix, $A^T A$ and $A A^T$ are square symmetric matrices.
 If the columns of $A$ are linearly independent, then $A^T A$ is nonsingular.
 If the rows of $A$ are linearly independent, then $A A^T$ is nonsingular.
 The nonzero eigenvalues of $A^T A$ and $A A^T$ are the same, and all of their eigenvalues are nonnegative.
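The $A^T A$ / $A A^T$ facts are easy to see numerically (a sketch with a random nonsquare matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))    # nonsquare

G = A.T @ A                        # 3x3, square and symmetric
H = A @ A.T                        # 4x4, square and symmetric
assert np.allclose(G, G.T) and np.allclose(H, H.T)

# eigvalsh exploits symmetry and returns real eigenvalues in ascending order
eig_G = np.linalg.eigvalsh(G)
eig_H = np.linalg.eigvalsh(H)
assert np.all(eig_G >= -1e-10)     # nonnegative
assert np.all(eig_H >= -1e-10)

# The nonzero eigenvalues of A^T A and A A^T coincide
# (H has one extra, numerically zero, eigenvalue)
assert np.allclose(eig_G, eig_H[-3:])
```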
 We call $A$ a Positive Definite matrix if and only if for all $x \ne 0$, $x^T A x > 0$. We represent this with $A \succ 0$
 Respectively, we call $A$ a Negative Definite matrix, $A \prec 0$, if and only if $x^T A x < 0$ for all $x \ne 0$
 Respectively, we call $A$ a Negative Semidefinite matrix, $A \preceq 0$, if and only if $x^T A x \le 0$ for all $x$
 Respectively, we call $A$ a Positive Semidefinite matrix, $A \succeq 0$, if and only if $x^T A x \ge 0$ for all $x$
 $A$ is positive definite if and only if it has positive eigenvalues
 Respectively, $A$ is negative definite if and only if it has negative eigenvalues
 Respectively, $A$ is positive semidefinite if and only if it has nonnegative eigenvalues
 Respectively, $A$ is negative semidefinite if and only if it has nonpositive eigenvalues
 For symmetric matrices $A$ and $B$, $A \succ B$ if and only if $A - B \succ 0$
 Respectively, $A \succeq B$ if and only if $A - B \succeq 0$
 The sum of two positive definite matrices is also positive definite.
 Every positive definite matrix is nonsingular and invertible, and its inverse is also positive definite
 If $A$ is positive definite and $\alpha > 0$ is real, then $\alpha A$ is positive definite
 Let $A$ be a symmetric $n \times n$ matrix. Then the determinant
$\Delta_k = \det \begin{pmatrix} a_{11} & \cdots & a_{1k} \\ \vdots & \ddots & \vdots \\ a_{k1} & \cdots & a_{kk} \end{pmatrix}$
for $k = 1, \dots, n$ is called the $k$th principal minor of $A$.
 If $A$ is a symmetric matrix, and if $\Delta_k$ is the $k$th principal minor for $k = 1, \dots, n$, then
 $A$ is positive definite if $\Delta_k > 0$ for all $k$
 $A$ is negative definite if $(-1)^k \Delta_k > 0$ for all $k$. That is, the principal minors alternate sign starting with $\Delta_1 < 0$
 If either of the above conditions holds with weak inequalities ($\ge$) in place of strict ones, then $A$ is semidefinite. (Strictly speaking, the weak test requires checking all principal minors, not only the leading ones.)
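Both definiteness tests, eigenvalues and leading principal minors, on one classic positive definite example:

```python
import numpy as np

A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])   # symmetric, positive definite

# Eigenvalue test: A > 0 iff all eigenvalues are positive
eigs = np.linalg.eigvalsh(A)          # 2 - sqrt(2), 2, 2 + sqrt(2)
assert np.all(eigs > 0)

# Leading principal minor test: all Delta_k > 0
minors = [np.linalg.det(A[:k, :k]) for k in range(1, 4)]
assert all(m > 0 for m in minors)     # the minors are 2, 3, 4
```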
### Sources