# Things you should have learned from the MA283 Linear Algebra course by 10th March 2009

### Some terms you should now understand

System of linear equations (Page 1)

Pivot variables and free variables (Page 80)

Linear combination (Pages 6 & 7 & 478)

Transpose of a matrix (Pages 49 & 480)

Inverse of a matrix (Pages 45 & 478)

Trace of a matrix (Page 480)

Vector space (Pages 69 & 480)

Vector subspace (Pages 70 & 480)

Row operations or row exchanges (three basic ones) (Pages 32-34)

Linear transformation or linear map (Page 126)

Nullspace of a matrix and kernel of a linear transformation (Page 73)

Column space of a matrix and image of a linear transformation (Page 71)

Linearly independent set (Page 92)

The vector space spanned by a set of vectors (Page 94)

Basis for a vector space (Page 95)

Dimension of a vector space (Page 96)

Inner product (See lecture notes and Pages 143, 153 & 183)

Orthogonal vectors (Page 142)

Orthonormal set (Page 174)

Determinant of a square matrix (Pages 203-206)

Eigenvector and eigenvalue (Page 235)

### Some tasks you should now be able to perform

Express a system of n equations in m unknowns as a matrix equation.

Use row operations to solve a system of n linear equations in n unknowns.
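As a concrete illustration (not from the course text), here is a minimal Python sketch of this task: Gaussian elimination on the augmented matrix, using the row operations above, followed by back-substitution. The function name `solve` is just illustrative.

```python
def solve(A, b):
    """Solve A x = b for square A by row operations (Gaussian elimination)."""
    n = len(A)
    # Build the augmented matrix [A | b] so row operations act on both sides.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # Row exchange: bring the largest entry in this column up to the pivot position.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Subtract multiples of the pivot row to clear the entries below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x
```

For example, `solve([[2, 1], [1, 3]], [3, 4])` returns the solution of 2x + y = 3, x + 3y = 4.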

Use row operations to calculate the inverse of a square matrix.
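A sketch of the same idea for inversion (again illustrative, not the textbook's code): row-reduce the augmented matrix [A | I] until the left half becomes I, at which point the right half is A⁻¹.

```python
def inverse(A):
    """Invert square A by row-reducing the augmented matrix [A | I] to [I | A^-1]."""
    n = len(A)
    # Augment A with the identity matrix.
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)] for i, row in enumerate(A)]
    for col in range(n):
        # Row exchange to get a nonzero pivot (largest in magnitude, for stability).
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Scale the pivot row so the pivot entry is 1.
        p = M[col][col]
        M[col] = [v / p for v in M[col]]
        # Clear the column above and below the pivot.
        for r in range(n):
            if r != col and M[r][col] != 0:
                factor = M[r][col]
                M[r] = [v - factor * w for v, w in zip(M[r], M[col])]
    # The right half of M is now the inverse.
    return [row[n:] for row in M]
```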

Use row operations to find the general solution of a system of n equations in m>n unknowns.

Write down an elementary matrix corresponding to an elementary row operation.
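A small sketch of the three elementary matrices (function names illustrative): each is the identity with one row operation applied, and multiplying A on the left by such a matrix performs that row operation on A.

```python
def identity(n):
    return [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    """Matrix product, used here to check that E A performs the row operation."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def swap_rows(n, i, j):
    """Elementary matrix exchanging rows i and j."""
    E = identity(n)
    E[i], E[j] = E[j], E[i]
    return E

def scale_row(n, i, c):
    """Elementary matrix multiplying row i by the nonzero scalar c."""
    E = identity(n)
    E[i][i] = c
    return E

def add_multiple(n, i, j, c):
    """Elementary matrix adding c times row j to row i."""
    E = identity(n)
    E[i][j] = c
    return E
```

For instance, `matmul(add_multiple(2, 1, 0, -3.0), A)` subtracts 3 times row 0 of A from row 1, exactly the step used when eliminating below a pivot.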

Determine whether a set of vectors is linearly independent.
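One way to carry this out by hand or by machine (sketched below, names illustrative): row-reduce the matrix whose rows are the vectors and count the pivots; the set is linearly independent exactly when the rank equals the number of vectors.

```python
def rank(vectors, tol=1e-10):
    """Rank of the matrix whose rows are the given vectors, via row reduction."""
    M = [list(map(float, v)) for v in vectors]
    rows, cols = len(M), len(M[0])
    r = 0  # next pivot row
    for c in range(cols):
        if r == rows:
            break
        # Partial pivoting: bring the largest entry in this column up.
        p = max(range(r, rows), key=lambda i: abs(M[i][c]))
        if abs(M[p][c]) < tol:
            continue  # no pivot in this column
        M[r], M[p] = M[p], M[r]
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            for k in range(c, cols):
                M[i][k] -= f * M[r][k]
        r += 1
    return r

def independent(vectors):
    """Vectors are linearly independent iff the rank equals their number."""
    return rank(vectors) == len(vectors)
```

The same rank computation answers the spanning question in the next task: a set spans W exactly when its rank equals the dimension of W.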

Determine whether a set of vectors spans a given subspace W.

Determine the matrix representing a linear transformation f:V-->W with respect to given bases of V and W.

Given a matrix A representing a linear transformation f:V-->W, calculate f(v) for some vector v in V.

Determine a basis for the kernel and the image of a linear map f:V-->W. The map could be given as a matrix with respect to the natural basis, or given by a formula.
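For the matrix case, here is a sketch (illustrative, assuming the matrix is given with respect to the natural bases): reduce to reduced row echelon form, then each free variable yields one basis vector of the kernel by setting that free variable to 1 and reading the pivot variables off the reduced matrix.

```python
def kernel_basis(A, tol=1e-10):
    """Basis for the nullspace of A: one vector per free variable."""
    R = [list(map(float, row)) for row in A]
    rows, cols = len(R), len(R[0])
    pivots, r = [], 0
    # Reduce to reduced row echelon form, recording the pivot columns.
    for c in range(cols):
        if r == rows:
            break
        p = max(range(r, rows), key=lambda i: abs(R[i][c]))
        if abs(R[p][c]) < tol:
            continue  # free column: no pivot here
        R[r], R[p] = R[p], R[r]
        piv = R[r][c]
        R[r] = [v / piv for v in R[r]]
        for i in range(rows):
            if i != r:
                f = R[i][c]
                R[i] = [v - f * w for v, w in zip(R[i], R[r])]
        pivots.append(c)
        r += 1
    # One basis vector per free column: set that free variable to 1,
    # and read the pivot-variable values off the reduced matrix.
    basis = []
    for fc in (c for c in range(cols) if c not in pivots):
        v = [0.0] * cols
        v[fc] = 1.0
        for row_idx, pc in enumerate(pivots):
            v[pc] = -R[row_idx][fc]
        basis.append(v)
    return basis
```

A basis for the image is then read off differently: take the columns of the *original* matrix A that sit in the pivot positions found above.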

Apply the Gram-Schmidt process to a linearly independent set E in order to produce an orthonormal set with the same span as E. The application could be with respect to the usual dot product in n-dimensional Euclidean space, or with respect to some inner product on the space of m×n matrices, or with respect to some inner product on a space of functions.
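A minimal sketch of the process for the usual dot product (not the course's own code): subtract from each vector its projections onto the vectors already produced, then normalise. For one of the other inner products mentioned above, only the `dot` helper would change.

```python
def gram_schmidt(vectors):
    """Orthonormalise a linearly independent list of vectors (usual dot product)."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    basis = []
    for v in vectors:
        # Subtract the projection of v onto each orthonormal vector found so far.
        w = list(v)
        for q in basis:
            c = dot(w, q)
            w = [wi - c * qi for wi, qi in zip(w, q)]
        # Normalise the remaining orthogonal component.
        norm = dot(w, w) ** 0.5
        basis.append([wi / norm for wi in w])
    return basis
```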

Given a set of vectors spanning a subspace W in Euclidean space, and given a vector v in Euclidean space, determine the vector in W “closest” to v.
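The "closest" vector is the orthogonal projection of v onto W. A sketch (illustrative name, and assuming the spanning set is linearly independent): first orthonormalise the spanning set by Gram-Schmidt, then sum the projections of v onto each orthonormal basis vector.

```python
def closest_in_subspace(v, spanning):
    """Orthogonal projection of v onto span(spanning); assumes spanning is independent."""
    def dot(u, w):
        return sum(a * b for a, b in zip(u, w))

    # Gram-Schmidt the spanning set to get an orthonormal basis of W.
    basis = []
    for u in spanning:
        w = list(u)
        for q in basis:
            c = dot(w, q)
            w = [wi - c * qi for wi, qi in zip(w, q)]
        n = dot(w, w) ** 0.5
        basis.append([wi / n for wi in w])
    # The closest vector in W is the sum of the projections of v onto the basis.
    p = [0.0] * len(v)
    for q in basis:
        c = dot(v, q)
        p = [pi + c * qi for pi, qi in zip(p, q)]
    return p
```

For example, projecting [1, 2, 3] onto the xy-plane (spanned by the first two standard basis vectors) returns [1, 2, 0].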

### Some results you should now be able to prove

If a vector space is spanned by n vectors then any set of n+1 vectors in V is linearly dependent.

Any two bases of a finite dimensional vector space have the same number of elements.

The kernel of a linear transformation is a subspace.

The image of a linear transformation is a subspace.

Dim(Ker(f)) + Dim(Im(f)) = Dim(V) for a linear map f:V-->W.

The vectors produced by the Gram-Schmidt process are orthonormal.

Any set of eigenvectors associated to distinct eigenvalues is linearly independent.

### Other things you should know

You should understand the meaning of any result proved in the homework exercises and be able to reproduce the proof.

You should understand any calculation made in the homework exercises and be able to reproduce similar calculations.

You should be vaguely aware that orthonormal bases are important for least-squares fits.

You should be vaguely aware that diagonal matrices are “good” for representing maps.