# Review Sheet for Math 225, Section M1 Final Exam

**1. Linear System of Equations.** Know how to:

• write them in vector and matrix form, see Sections 1.3 and 1.4;

• solve them using:

– row reduction, see Section 1.2;

– inverse of the matrix of the system (works only when the number of unknowns equals the number of equations), see Section 2.2;

– Cramer's rule (works only when the number of unknowns equals the number of equations), see Section 3.3;

• write their solutions in (parametric) vector form, see Section 1.5.

You should also know when the linear system of equations (in matrix form) Ax = b:

• has no solution (inconsistent), or, on the contrary, has at least one solution (consistent), and how these situations are related to the span of the columns of A, or to ColA, see Sections 1.2, 1.4 and 4.2;

• has at most one solution, or, on the contrary, infinitely many solutions, and how these situations are related to: the linear independence/dependence of the columns of A, or NulA, or the dimension of NulA, see Sections 1.7, 4.2, 4.5.
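The row-reduction algorithm of Section 1.2 can be sketched in code. Below is a minimal illustration (the function name `rref` and the sample system are my own, not from the text): it reduces an augmented matrix [A | b] to reduced echelon form using the three elementary row operations, after which the solutions can be read off.

```python
from fractions import Fraction

def rref(M):
    """Reduce an augmented matrix [A | b] to reduced echelon form using
    the three elementary row operations: interchange, scaling, replacement."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    pivot_row = 0
    for col in range(cols - 1):          # last column is the right-hand side b
        # find a row at or below pivot_row with a nonzero entry in this column
        pr = next((r for r in range(pivot_row, rows) if M[r][col] != 0), None)
        if pr is None:
            continue                     # no pivot here: a free variable
        M[pivot_row], M[pr] = M[pr], M[pivot_row]        # interchange
        piv = M[pivot_row][col]
        M[pivot_row] = [x / piv for x in M[pivot_row]]   # scaling: pivot -> 1
        for r in range(rows):
            if r != pivot_row and M[r][col] != 0:        # replacement
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[pivot_row])]
        pivot_row += 1
    return M

# x + 2y = 5 and 3x + 4y = 6 reduce to x = -4, y = 9/2
print(rref([[1, 2, 5], [3, 4, 6]]))
```

When a column of A has no pivot, the corresponding variable is free, which is exactly where the parametric vector form of Section 1.5 comes from.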

**2. Matrix Operations and Invertible Matrices.** Know:

• how to add two matrices, how to multiply a matrix by a number (scalar), how to multiply two matrices, how to take the transpose of a matrix, and the properties of all these operations, see Section 2.1;

• the definition of an invertible matrix, see Section 2.2;

• how to use invertible matrices to solve systems of equations, see Theorem 5 in Section 2.2;

• how to compute the inverse of a matrix:

– for a 2 × 2 matrix, see Theorem 4 in Section 2.2;

– via row operations (row reduction), see Section 2.2;

– using determinants, see Section 3.3;

• properties of the inverse of a matrix, see Theorem 6 in Section 2.2;

• how to characterize the invertibility of a square matrix A via:

– linear independence of columns of A, or the number of solutions of Ax = 0, or NulA, or the dimension of NulA, see Theorem 8 in Section 2.3 and the Theorem in Section 4.6;

– consistency of Ax = b, or the span of columns of A, or ColA, or the dimension of ColA, or rankA, see Theorem 8 in Section 2.3 and the Theorem in Section 4.6;

– the determinant of A, see Theorem 4 in Section 3.2;

– the invertibility of the transpose of A, see Theorem 8 in Section 2.3.
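Theorems 4 and 5 in Section 2.2 combine into a short computation; the sketch below (function names are my own, not from the text) inverts a 2 × 2 matrix by the (ad − bc) formula and then solves Ax = b as x = A^{-1}b.

```python
from fractions import Fraction

def inverse_2x2(A):
    """Theorem 4-style inverse of a 2x2 matrix [[a, b], [c, d]]:
    A^{-1} = (1 / (ad - bc)) * [[d, -b], [-c, a]], defined only if ad - bc != 0."""
    (a, b), (c, d) = A
    det = Fraction(a * d - b * c)
    if det == 0:
        raise ValueError("not invertible: det A = 0")
    return [[d / det, -b / det], [-c / det, a / det]]

def mat_vec(A, x):
    """Matrix-vector product Ax."""
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in A]

# Solve 2x + y = 4, 5x + 3y = 7 via x = A^{-1} b
A, b = [[2, 1], [5, 3]], [4, 7]
x = mat_vec(inverse_2x2(A), b)   # [5, -6]
```

This mirrors Theorem 5: when A is invertible, Ax = b has the unique solution x = A^{-1}b.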

**3. Determinants.** Know:

• the definition of the determinant of a 1 × 1 matrix, of a 2 × 2 matrix, and of larger matrices via cofactor expansions, see Section 3.1;

• how a determinant changes after a row operation, when taking the transpose, or after multiplying two matrices, see Section 3.2;

• how to calculate determinants using row and column operations and cofactor expansions, in particular how to compute determinants of upper or lower triangular matrices;

• how to use determinants to:

– establish whether a matrix is invertible, see Theorem 4 in Section 3.2;

– solve linear systems of equations using Cramer's rule, see Section 3.3;

– calculate the inverse of a matrix, see Section 3.3;

– calculate areas of parallelograms or volumes of parallelepipeds, see Section 3.3.
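The cofactor expansion of Section 3.1 and Cramer's rule of Section 3.3 can both be written in a few lines; the recursive `det` and `cramer` below are illustrative sketches (names and examples are my own, not the text's).

```python
from fractions import Fraction

def det(A):
    """Determinant via cofactor expansion along the first row (Section 3.1)."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([row[:j] + row[j + 1:] for row in A[1:]])
               for j in range(len(A)))

def cramer(A, b):
    """Cramer's rule: x_i = det(A_i(b)) / det(A), where A_i(b) is A with
    column i replaced by b; valid only when det(A) != 0."""
    d = det(A)
    if d == 0:
        raise ValueError("Cramer's rule requires det A != 0")
    return [Fraction(det([row[:i] + [bi] + row[i + 1:] for row, bi in zip(A, b)]), d)
            for i in range(len(A))]

# The determinant of a triangular matrix is the product of its diagonal entries:
print(det([[2, 1, 3], [0, 4, 5], [0, 0, 6]]))   # 48
print(cramer([[2, 1], [5, 3]], [4, 7]))         # [5, -6] as Fractions
```

Cofactor expansion is exponential in the matrix size; for anything beyond small exam-sized matrices, the row-operation method of Section 3.2 is the practical choice.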

**4. Vector Spaces.** Know:

• the definition of a vector space, properties of the operations in a vector space, the definition of span, see Section 4.1;

• the definition of a subspace, how to check whether given sets are subspaces, in particular why, for an m × n matrix A, NulA and ColA are subspaces of R^{n} and R^{m}, respectively, see Sections 4.1 and 4.2;

• the definition of linear dependence and linear independence of a set of vectors and how to check whether a given set of vectors is linearly dependent/independent, in particular the relation between linear dependence and independence of vectors in R^{n} and the number of solutions of the equation Ax = 0, see Sections 4.3 and 1.7;

• the definition of a basis in a vector space, the definition of the dimension of a vector space, how to find a basis and the dimension of NulA and ColA where A is an m × n matrix, see Sections 4.3 and 4.5;

• the definition of the rank of a matrix, the Rank Theorem and its applications to systems of linear equations and to establishing the invertibility of a matrix, see Section 4.6.
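A common exam task from Sections 4.3, 4.5 and 4.6 is finding bases for ColA and NulA and applying the Rank Theorem. The sketch below (function name and example are my own) row-reduces A and reports its pivot columns: the matching columns of the original A form a basis of ColA, rankA is the number of pivots, and dim NulA = n − rankA.

```python
from fractions import Fraction

def pivot_columns(A):
    """Row-reduce A and return the indices of its pivot columns."""
    M = [[Fraction(x) for x in row] for row in A]
    rows, cols = len(M), len(M[0])
    pivots, pr = [], 0
    for c in range(cols):
        r = next((i for i in range(pr, rows) if M[i][c] != 0), None)
        if r is None:
            continue                 # no pivot here: a free variable of Ax = 0
        M[pr], M[r] = M[r], M[pr]
        M[pr] = [x / M[pr][c] for x in M[pr]]
        for i in range(rows):
            if i != pr and M[i][c] != 0:
                f = M[i][c]
                M[i] = [x - f * y for x, y in zip(M[i], M[pr])]
        pivots.append(c)
        pr += 1
    return pivots

A = [[1, 2, 0], [2, 4, 1], [3, 6, 2]]   # the second column is twice the first
piv = pivot_columns(A)                   # [0, 2]: columns 1 and 3 of A span ColA
rank = len(piv)                          # rankA = 2
dim_nul = len(A[0]) - rank               # Rank Theorem: dim NulA = 3 - 2 = 1
```

Note that the basis columns are taken from the original A, not from its reduced form; row operations change ColA but preserve which columns are pivot columns.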

**5. Eigenvalues, Eigenvectors and Diagonalization of Square Matrices.** Know:

• the definition of an eigenvalue and an eigenvector, see Section 5.1;

• how to compute the eigenvalues and the corresponding eigenspaces for a given square matrix, see Section 5.2;

• when a square matrix is diagonalizable and how its eigenvalues and eigenvectors are used to diagonalize a matrix, see Section 5.3;

• how to use diagonalization in computing the powers of a square matrix, see
Section 5.3.
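Sections 5.2 and 5.3 tie together: eigenvalues are the roots of the characteristic equation det(A − λI) = 0, and if A = PDP^{-1} then A^k = PD^kP^{-1}. Below is a sketch for 2 × 2 matrices under assumptions I am adding for simplicity (distinct integer eigenvalues and a nonzero upper-right entry b, so (b, λ − a) is an eigenvector); the function names are my own.

```python
from fractions import Fraction
import math

def eigenvalues_2x2(A):
    """Roots of lambda^2 - (trace A) lambda + det A = 0, i.e. det(A - lambda I) = 0.
    Sketch: assumes the discriminant is a perfect square (integer eigenvalues)."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    root = math.isqrt(tr * tr - 4 * det)
    return [Fraction(tr + root, 2), Fraction(tr - root, 2)]

def mat_mul(A, B):
    """Matrix product AB."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def power_by_diagonalization(A, k):
    """A^k = P D^k P^{-1} for a diagonalizable 2x2 A = [[a, b], [c, d]], b != 0.
    Columns of P are the eigenvectors (b, lambda - a); D holds the eigenvalues."""
    lam1, lam2 = eigenvalues_2x2(A)
    (a, b), (c, d) = A
    P = [[b, b], [lam1 - a, lam2 - a]]
    detP = P[0][0] * P[1][1] - P[0][1] * P[1][0]
    Pinv = [[P[1][1] / detP, -P[0][1] / detP],
            [-P[1][0] / detP, P[0][0] / detP]]
    Dk = [[lam1 ** k, 0], [0, lam2 ** k]]
    return mat_mul(mat_mul(P, Dk), Pinv)

# A = [[4, 1], [2, 3]] has eigenvalues 5 and 2; A^2 = [[18, 7], [14, 11]]
A2 = power_by_diagonalization([[4, 1], [2, 3]], 2)
```

The payoff is that D^k only requires raising the diagonal entries (the eigenvalues) to the k-th power, which is the point of the Section 5.3 application.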

**6. Inner Product, Distance, Orthogonality and Least Squares Problems.** Know:

• the definition of the inner product of vectors in R^{n} and its properties, the definitions of the length of a vector and the distance between two vectors, see Section 6.1;

• the definition of orthogonality and the Pythagorean Theorem, see Section 6.1;

• the definition of orthogonal sets of vectors in R^{n} and their linear
independence, see Section 6.2;

• the definition and how to calculate the orthogonal projection of a vector onto a subspace of R^{n}, see Section 6.3;

• the definition of least squares solutions and how to calculate them, see
Theorem 13 in Section 6.5.
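The projection formula of Section 6.3 and the normal equations of Theorem 13 in Section 6.5 are both direct computations. A sketch (function names are mine; the least-squares helper is restricted, as an assumption, to matrices with two columns and invertible A^T A):

```python
from fractions import Fraction

def dot(u, v):
    """Inner product u . v in R^n (Section 6.1)."""
    return sum(ui * vi for ui, vi in zip(u, v))

def project(y, basis):
    """Orthogonal projection of y onto Span(basis) for an ORTHOGONAL basis:
    y_hat = sum over u of (y.u / u.u) u  (Section 6.3)."""
    y_hat = [Fraction(0)] * len(y)
    for u in basis:
        c = Fraction(dot(y, u), dot(u, u))
        y_hat = [yh + c * ui for yh, ui in zip(y_hat, u)]
    return y_hat

def least_squares(A, b):
    """Least-squares solution of Ax = b from the normal equations
    A^T A x = A^T b; sketch for A with two columns and A^T A invertible."""
    AT = [list(col) for col in zip(*A)]          # rows of AT are columns of A
    ATA = [[dot(row, col) for col in AT] for row in AT]
    ATb = [dot(row, b) for row in AT]
    (p, q), (r, s) = ATA
    d = p * s - q * r                            # det(A^T A), assumed nonzero
    return [Fraction(ATb[0] * s - q * ATb[1], d),
            Fraction(p * ATb[1] - r * ATb[0], d)]

# Projecting (7, 6) onto the line spanned by (4, 2) gives (8, 4)
print(project([7, 6], [[4, 2]]))
# Least-squares solution of an inconsistent 3-equation, 2-unknown system
print(least_squares([[1, 0], [1, 1], [1, 2]], [6, 0, 0]))
```

The least-squares solution x̂ is exactly the x for which Ax̂ is the orthogonal projection of b onto ColA, which is how Sections 6.3 and 6.5 connect.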