## 01:640:350:02 Linear Algebra Section

Woodward, Christopher | Lecture | Index 02403 | 640:350:02 | MW5 | 3:20-4:40 PM | SEC 206 | Busch Campus |

- This course is a proof-based continuation of Math 250, covering abstract vector spaces and linear transformations, inner product spaces, diagonalization, and canonical forms.

*Prerequisites:*

- CALC4, Math 250 and Math 300

Text: *Linear Algebra* (4th ed.), by Friedberg, Insel, and Spence, Prentice Hall, 2003. ISBN 0-13-008451-4.

- Lectures: MW5 (3:20-4:40 PM) in **SEC 206**
- Office hours: Wednesday 2-3 PM, Hill 726
- Contact information: e-mail ctw@math.rutgers.edu

The course is strongly based on Math 250. However, we'll work axiomatically, starting from the abstract notions of vector space and of linear transformation. Much of the homework and many of the exam and quiz problems will require you to write precise proofs, building on your proof-writing experience in Math 300. From this more abstract viewpoint, we'll be developing linear algebra far beyond Math 250, with new insight and new applications.

Class attendance is very important. A lot of what we do in class will involve collective participation. We will cover the topics indicated in the syllabus below, but the dates that we cover some of the topics might be adjusted during the semester, depending on class discussion, etc. Such adjustments, along with the almost-weekly homework assignments, will be announced in class and also posted on this webpage, so be sure to check this webpage regularly. Absences from a single class due to minor illnesses should be self-reported using the university system; for longer absences, students should email me with the situation. I reserve the right to lower the course grade up to one full letter grade for poor attendance.

Make-ups for exams are generally not given; if a student has an extremely good reason (e.g., a documented medical emergency), I may re-arrange the grading scheme to accommodate.

Problem sets are due on most Wednesdays. There are no problems due on the two midterm-exam Wednesdays.

Note that we will cover significant material from all the chapters in the book, Chapters 1-7, but we will cover Chapter 7 before Chapter 6. This is because the material in Chapter 7 is a natural continuation of the material in Chapter 5 on the theory of eigenvalues, eigenvectors and diagonalizability. Chapter 6 also concerns eigenvalues, eigenvectors and diagonalizability, but this time, based on a generalization of the theory of dot products.

Quizzes will be given at the ends of a few class sessions. The dates of these quizzes, and the topics covered, will be announced in advance.

Grading policy: First midterm exam: 100 points; Second midterm exam: 100 points; Homework and quizzes: 100 points; Final exam: 200 points (Total: 500 points).

#### Tentative Course Syllabus

Week | Lecture dates | Sections | Topics |
---|---|---|---|
1 | 9/6 (W) | Chapter 1 | Abstract vector spaces & subspaces |
2 | 9/11, 9/13 | Chapter 1 | Span of subsets, linear independence |
3 | 9/18, 9/20 | Chapter 1 | Bases and dimension |
4 | 9/25, 9/27 | Chapter 2 | Linear transformations |
5 | 10/2, 10/4 | Chapter 2 | Change of basis, dual spaces |
6 | 10/9, 10/11 | Ch. 1-2 | Review and Exam 1 (10/11) |
7 | 10/16, 10/18 | Chapter 3 | Rank and systems of linear equations |
8 | 10/23, 10/25 | Chapter 4 | Determinants and their properties |
9 | 10/30, 11/1 | Chapter 5 | Eigenvalues/eigenvectors |
10 | 11/6, 11/8 | Chapter 5 | Diagonalization, Markov chains |
11 | 11/13, 11/15 | Chapter 6 | Inner product spaces |
12 | 11/20 | Chapter 6 | Unitary and orthogonal operators |
13 | 11/27, 11/29 | Ch. 3, 4, 5, 7 | Review and Exam 2 (11/29) |
14 | 12/4, 12/6 | Chapter 7 | Jordan canonical form |
15 | 12/11, 12/13 | Chapter 7 | Rational canonical form (last class) |
17 | 12/22 (Friday) | 12-3 PM | Final Exam |

#### Exam Dates

The exam dates are listed in the schedule above. Any conflict (such as with a religious holiday) should be reported to me at the beginning of the semester, so that the exam may be re-scheduled.

### Special Accommodations

Students with disabilities requesting accommodations must follow the procedures outlined at https://ods.rutgers.edu/students/applying-for-services

### Academic Integrity

All Rutgers students are expected to be familiar with and abide by the academic integrity policy. Violations of the policy are taken very seriously. In particular, your work should be your own; you are responsible for properly crediting all help with the solution.

#### Problem Set 1

(Problems in pdf. Solutions in pdf.)

(1) Find the set of polynomials \(f(x) = ax^2 + bx + c\) satisfying \(f(-1) = -1\), \(f'(-1) = -1\).

(2) Show that the set of functions \(f: S \to \mathbb{R}\) from a set \(S\) to the real numbers \(\mathbb{R}\) with addition given by \((f + g)(x) = f(x) + g(x)\) and scalar multiplication given by \((cf)(x) = c(f(x))\) satisfies axiom (VS4) in the axioms for a vector space.

(3) Show that the set of polynomials in a single real variable with addition given by \((f + g)(x) = f(x) + g(x)\) and scalar multiplication \((cf)(x) = f(cx)\) is not a vector space.

(4) Show that if \(V\) is a vector space and \(v \in V\) and \(c \in F\) are such that \(cv = 0\) then either \(c = 0\) or \(v = 0\). [Hint: argue by cases or by contradiction.]

(5) Prove that \( \{ (1,1,0), (1,0,1), (0,1,1) \}\) is linearly independent over \(\mathbb{R}\) but linearly dependent over \(\mathbb{Z}_2\).
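Not part of the assigned proof, but a determinant computation previews why the answer differs over the two fields: the matrix with these rows has determinant \(-2\), which is nonzero over \(\mathbb{R}\) but reduces to \(0\) mod \(2\). A quick numerical sketch (numpy is an illustrative assumption here):

```python
import numpy as np

# Rows are the three vectors from problem (5).
M = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1]])
d = round(np.linalg.det(M))
assert d == -2      # nonzero: linearly independent over R
assert d % 2 == 0   # zero mod 2: linearly dependent over Z_2
```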

#### Problem Set 2

(Problems in pdf. Solutions in pdf.)

(1) Prove that a nonempty subset \(W\) of a vector space \(V\) is a subspace iff \(\operatorname{span}(W) = W\).

(2) Prove that the polynomials \(1, x, x^2\) are linearly independent in the space of functions from \(\mathbb{R}\) to \(\mathbb{R}\).

(3) Find a basis for the space of real polynomials \(\{ f(x) = ax^3 + bx^2 + cx + d \ | \ f(1) = 0 \}\).

(4) Find a basis for the vector space of real polynomials of arbitrary degree, and prove that your answer is a basis. [Warning: linear independence is a bit tricky.] Show that this space is infinite dimensional.

(5) Show that if \(W\) is a subspace of a finite-dimensional vector space \(V\) and \(\dim(W) = \dim(V)\) then \(V = W\).

(6) Find a basis for the space of real skew-symmetric matrices of size \(n\). What is the dimension of the space?

#### Problem Set 3

(Problems in pdf. Solutions in pdf.)

(1) Show that the space of convergent sequences of real numbers is an infinite-dimensional subspace of the space of all sequences of real numbers. [Hint: exhibit a linearly independent subset of infinite size.]

(2) Show that if \(T: V \rightarrow W\) and \(S: W \rightarrow U\) are linear transformations then the composition \(S \circ T: V \rightarrow U\) is also a linear transformation.

(3) Suppose that \(T:\mathbb{R}^2 \rightarrow \mathbb{R}^2\) is a linear transformation. If \(T(1,0) = (2,3)\) and \(T(0,1) = (3,4)\), find \(T(2,1)\). Justify your answer.

(4) Show that the function \(T:\mathbb{R}^2 \rightarrow \mathbb{R}^2\) defined by \(T(x,y) = (|x|,|y|)\) is not a linear transformation.

(5) Let \(T: P_2(\mathbb{R}) \rightarrow P_2(\mathbb{R})\) (here \(P_2(\mathbb{R})\) is the space of polynomials in a real variable of degree at most 2) be defined by \((T(f))(x) = x f'(x)\). Show that \(T\) is a linear transformation and find its nullspace and range.

#### Problem Set 4

(Problems in pdf. Solutions in pdf.)

(1) Find the coordinates of the given vector with respect to the given basis.

(a) \(V = \mathbb{R}^2, B = \{ (1,0), (0,1) \}, v = (1,2)\).

(b) \(V = \mathbb{R}^2, B = \{ (1,1), (1,-1) \}, v = (1,2)\).

(c) \(V = P_2, B = \{ 1, x, x^2 \}, v(x) = (x + 1)^2 \).

(d)

\(V\) the span of \(\sin(x),\cos(x)\) in the space of functions of a real variable \(x\),

\(B = \{ \sin(x), \cos(x) \}\), \(v(x) = \sin(x + 1)\).

(2) Find the matrix of the given linear transformation with respect to the given bases.

(a) \(T: \mathbb{R}^2 \to \mathbb{R}^2, (x,y) \mapsto (-y,x), B = B' = \{ (1,0), (0,1) \}\).

(b) \(T: P_2 \to P_2, B = B' = \{ 1,x,x^2 \}, (Tf)(x) = f(x+1) \).

(c) \(T: V \to V\) where \(V\) is the span of \(\sin(x),\cos(x)\) in the space of functions of a real variable \(x\), \(B = B' = \{ \sin(x), \cos(x) \}\), \((Tf)(x) = f(x + \pi/2) \).

(d) \(T: P_1 \to P_2\), \(B = \{ 1, x \}\), \(B' = \{ 1, x, x^2 \}\), \((Tf)(x) = \int_{0}^x f(t) dt \).

(3) Let \(S: V \to W\) and \(T: W \to U\) be linear transformations. Show that:

(a) If \(S\) and \(T\) are one-to-one, then \(T \circ S\) is one-to-one.

(b) If \(T \circ S\) is one-to-one, then \(S\) is one-to-one.

#### Problem Set 5

(Problem in pdf.)

(1) Let \(T: V \to W\) be a linear transformation and \(B \subset V\) a basis. Prove that \(T\) is an isomorphism if and only if \(T(B) := \{ T(b), b \in B \}\) is a basis for \(W\).

(2) In each case find the \(3 \times 3\) elementary matrix \(E\) so that multiplying on the left by \(E\) has the desired effect.

(a) Switching rows \(2\) and \(3\).

(b) Multiplying row \(3\) by \(5\).

(c) Subtracting \(4\) times row \(3\) from row \(1\).

(3) Write the matrix \( \left[ \begin{array}{lll} 0 & 1 & 0 \\ 0 & 0 & 2 \\ 3 & 0 & 0 \end{array} \right] \) as a product of elementary matrices.

(4) Find the inverse of the matrix in (3) and write it as a product of elementary matrices.

(5) Suppose that \(T: V \to W\) is an isomorphism of vector spaces.

(a) If \(Z \subseteq V\) is a subspace, show that \(\dim(T(Z)) = \dim(Z)\). Here \(T(Z) := \{ T(z) | z \in Z \}\).

(b) Suppose that \(S: U \to V\) is another linear transformation. Show that \(\operatorname{rank}(T \circ S) = \operatorname{rank}(S)\).

(c) Suppose \(A\) and \(B\) are matrices related by an elementary row operation. Do \(A\) and \(B\) have the same rank, that is, are their column spaces of the same dimension? Why or why not?

(6)

(a) Use Gaussian elimination to solve the system of linear equations

\(x_1 + 2x_2 - x_3 + 3x_4 = 2\)

\(2x_1 + 4x_2 - x_3 + 6x_4 = 5\)

\(x_2 + 2x_4 = 3.\)

Show your work, and for each step indicate the elementary row operation.

(b) Use Gaussian elimination to find the inverse of the matrix \(\left[ \begin{array}{lll} 1 & 2 & 1 \\ -1 & 1 & 2 \\ 1 & 0 & 1 \end{array} \right] \). For each step indicate which elementary row operation you used.
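For checking hand row-reduction (not part of the assignment), the inverse in (b) can be verified numerically; the matrix is invertible since its determinant is \(6\). A sketch assuming numpy:

```python
import numpy as np

A = np.array([[ 1.0, 2.0, 1.0],
              [-1.0, 1.0, 2.0],
              [ 1.0, 0.0, 1.0]])
assert round(np.linalg.det(A)) == 6        # nonzero, so A is invertible
Ainv = np.linalg.inv(A)
assert np.allclose(A @ Ainv, np.eye(3))    # A A^{-1} = I
assert np.allclose(Ainv @ A, np.eye(3))    # A^{-1} A = I
```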

#### Problem Set 6

(Problem in pdf)

(1) (a) Let \(T\) be a triangle in the plane whose vertices all have integer components. Show that the area of \(T\) is at least \(1/2\).

(b) Find a triangle in the plane whose vertices all have integer components, whose vertices include \((0,0)\) and \((3,5)\), and whose area is \(1/2\).

(2) Find the determinant of \( \left[ \begin{array}{lll} 1 & 2 & 4 \\ 1 & 3 & 9 \\ 1 & 4 & 16 \end{array} \right] \) using

(a) cofactor expansion along the second row.

(b) row reduction.
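Both methods should give the same value; as a cross-check outside the assignment, this is a Vandermonde matrix with nodes \(2, 3, 4\), so the determinant is \((3-2)(4-2)(4-3) = 2\). A numerical sketch assuming numpy:

```python
import numpy as np

V = np.array([[1.0, 2.0,  4.0],
              [1.0, 3.0,  9.0],
              [1.0, 4.0, 16.0]])
# matches the Vandermonde product (3-2)(4-2)(4-3) = 2
assert round(np.linalg.det(V)) == 2
```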

(3) (a) More generally, for any \(x_1,x_2,x_3,y_1,y_2,y_3\) with \(x_1,x_2,x_3\) distinct, find the determinant of \( \left[ \begin{array}{lll} 1 & x_1 & x_1^2 \\ 1 & x_2 & x_2^2 \\ 1 & x_3 & x_3^2 \end{array} \right] \) using row reduction.

(b) Show that given three points \((x_1,y_1),(x_2,y_2),(x_3,y_3)\) with distinct \(x_1,x_2,x_3\), there is a unique parabola \(f(x) = a + bx + cx^2\) passing through the three given points, that is, with \(f(x_k) = y_k\) for \(k = 1,2,3\).

(4) (a) A permutation matrix is a matrix with exactly one \(1\) in each row and each column and the remaining entries all equal to \(0\). Show that the determinant of any permutation matrix is \(1\) or \(-1\).

(b) A skew-symmetric (or antisymmetric) matrix is a matrix \(A\) with the property that \(A = -A^T\). Show that the determinant of any \(5 \times 5\) skew-symmetric matrix is zero.

(5) (a) Let \(u,v,w\) be vectors in \(\mathbb{R}^3\). Show that the matrix with columns \(u-v, v-w, w-u\) has zero determinant.

(b) Suppose that the matrix whose columns are \(u,v,w\) has determinant \(2\). What is the determinant of the matrix whose columns are \(u+v, v+w, w+u\)?

#### Problem Set 7, Math 350, Fall 2017

(Problems in pdf)

(1) Find the eigenvalues and eigenvectors for the following matrices. For each eigenvalue, identify the (algebraic) multiplicity and geometric multiplicity.

(a) \( \left[ \begin{array}{lll} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{array} \right]\)

(b) \( \left[ \begin{array}{lll} 1 & 1 & 1 \\ 0 & 0 & 1 \\ 0 & 0 & 1 \end{array} \right]\)

(c) \( \left[ \begin{array}{lll} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{array} \right]\)

(d) \( \left[ \begin{array}{lll} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 8 & 0 & 0 \end{array} \right]\)

(e) \( \left[ \begin{array}{llll} 1 & 1 & 0 & 0 \\ 1 & -1 & 0 & 0 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 0 \end{array} \right]\)
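Hand-computed eigenvalues can be sanity-checked numerically (not part of the assignment). For example, matrix (c) is a cyclic permutation, so its characteristic polynomial is \(\lambda^3 = 1\) and its eigenvalues are the cube roots of unity. A sketch assuming numpy:

```python
import numpy as np

# Matrix (c): a cyclic permutation of the coordinates.
C = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
lam = np.linalg.eigvals(C)
assert np.allclose(np.abs(lam), 1.0)   # all eigenvalues on the unit circle
assert np.allclose(lam ** 3, 1.0)      # each one is a cube root of unity
```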

(2) Suppose that companies App, Mot, and Sam are competing for smart-phone customers according to the following rule: each year, App loses ten percent of its customers to each of Mot and Sam; Mot loses thirty percent of its customers to each of App and Sam; and Sam loses thirty percent of its customers to each of App and Mot. Suppose at time \(t = 0\) there are 100 customers with Mot and none with the other companies. Find a formula for the number of customers with App as a function of time \(t\). What happens as \(t \to \infty\)?
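Not part of the assignment, but a short simulation previews the limiting behavior. A sketch assuming the column-stochastic transition matrix implied by the rates above, with states ordered App, Mot, Sam:

```python
import numpy as np

# Columns give where each company's customers go in one year.
M = np.array([[0.8, 0.3, 0.3],   # fraction ending up with App
              [0.1, 0.4, 0.3],   # fraction ending up with Mot
              [0.1, 0.3, 0.4]])  # fraction ending up with Sam
v = np.array([0.0, 100.0, 0.0])  # at t = 0, all 100 customers are with Mot
for _ in range(60):
    v = M @ v
# the iteration approaches the steady state [60, 20, 20]
assert np.allclose(v, [60.0, 20.0, 20.0])
```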

(3) (a) Prove that if \(\lambda\) is an eigenvalue of an invertible matrix \(A\), then \(\lambda\) is non-zero and \(1/\lambda\) is an eigenvalue of \(A^{-1}\).

(b) Prove that two eigenvectors \(v_1,v_2\) of a skew-symmetric matrix \(A\) with distinct eigenvalues \(\lambda_1,\lambda_2\) are orthogonal, that is, satisfy \(\overline{v_1} \cdot v_2 = 0\). (Hint: consider \(\overline{v_1} \cdot A v_2\) and \(\overline{Av_1} \cdot v_2\).)

(4) Prove that the eigenvalues of a permutation matrix are roots of unity, that is, each eigenvalue \(\lambda\) satisfies \(\lambda^n = 1\) for some \(n\).

(5) Consider the Fibonacci-type sequence \(F(t+1) = F(t) + F(t-1)\) starting with entries \(1,3\), that is, \(1,3,4,7,11,\ldots\). By considering this as part of a system of equations with the second equation \(F(t) = F(t)\), find a closed formula for \(F(t)\).
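Once the \(2 \times 2\) system is diagonalized, the resulting formula can be checked numerically (not part of the assignment). The sketch below compares iteration of the recurrence against the candidate closed form \(F(t) = \varphi^{t+1} + \psi^{t+1}\), where \(\varphi, \psi = (1 \pm \sqrt{5})/2\) are the eigenvalues of the companion matrix; the formula itself is an assumption to be derived by hand:

```python
phi = (1 + 5 ** 0.5) / 2   # eigenvalues of the companion matrix [[1, 1], [1, 0]]
psi = (1 - 5 ** 0.5) / 2

def F_iter(t):
    # iterate the recurrence with F(0) = 1, F(1) = 3
    a, b = 3, 1            # a = F(1), b = F(0)
    for _ in range(t):
        a, b = a + b, a
    return b

def F_closed(t):
    # candidate closed form obtained from the diagonalization
    return round(phi ** (t + 1) + psi ** (t + 1))

assert all(F_iter(t) == F_closed(t) for t in range(25))
```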

(6) Let \(V\) be the vector space of bounded sequences of real numbers, and let \(T: V \to V\) be the "shift linear transformation" \((Tf)_n = f_{n+1}\) for all \(n \ge 0\) and all sequences \((f_n)\). Find the set of eigenvalues of \(T\).

#### Problem Set 8

(Problems in pdf)

(1) (a) Two square matrices \(A,B\) of the same size are said to *commute* if \(AB = BA\). Show that if \(A,B\) commute, then \(A + B\) and \(B\) commute as well.

(b) Let \(A, B\) be commuting matrices. Show that if \(v\) is an eigenvector of \(A\) and \(Bv \neq 0\), then \(Bv\) is also an eigenvector of \(A\) with the same eigenvalue.

(c) Show that if \(A,B\) commute, then any eigenvalue of \(A + B\) is the sum of an eigenvalue of \(A\) and an eigenvalue of \(B\). (Hint: if \(E_\lambda(A + B)\) is an eigenspace of \(A + B\), then by parts (a) and (b) we have \(B E_\lambda(A + B) \subseteq E_\lambda(A + B)\). Let \(v\) be an eigenvector of \(B\) in \(E_\lambda(A + B)\)....)

(d) The *discrete Laplacian* (or discrete second derivative) is the operator \(L: \mathbb{R}^n \to \mathbb{R}^n\) defined by \((Lv)_k = (v_{k+1} - v_k) - (v_k - v_{k-1})\), where the indices are taken mod \(n\) (so \(v_{n+1} := v_1\)). Find the matrix of \(L\) with respect to the standard basis. (Your answer may use the \(\ldots\) notation.)

(e) Find the eigenvalues of the discrete Laplacian from part (d). (Hint: write the matrix of \(L\) as a sum of commuting matrices whose eigenvalues are easy to find, and use part (c).)
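The hint in (e) can be previewed numerically (not part of the assignment): writing \(L = S + S^T - 2I\) with \(S\) the cyclic shift, the eigenvalues come out to \(2\cos(2\pi k/n) - 2\). A sketch for \(n = 6\), assuming numpy:

```python
import numpy as np

n = 6
S = np.roll(np.eye(n), 1, axis=1)   # cyclic shift: (Sv)_k = v_{k+1}, indices mod n
L = S + S.T - 2 * np.eye(n)         # (Lv)_k = v_{k+1} - 2 v_k + v_{k-1}
expected = np.sort(2 * np.cos(2 * np.pi * np.arange(n) / n) - 2)
assert np.allclose(np.sort(np.linalg.eigvalsh(L)), expected)
```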

(2) (a) A square matrix \(A\) is *stochastic* if its entries are non-negative and each column sums to \(1\). Show that the product of two stochastic matrices of the same size is stochastic.

(b) Show that for any stochastic \(n \times n\) matrix \(A\) and standard basis vector \(e_j\), the entries of \(A^k e_j\) are between \(0\) and \(1\). Use this to show that for any \(n\)-vector \(v = [v_1 \ v_2 \ \ldots \ v_n]\), each entry of \(A^k v\) is between \(-(|v_1| + \ldots + |v_n|)\) and \(+(|v_1| + \ldots + |v_n|)\).

(c) Using (b), show that the eigenvalues \(\lambda\) of a stochastic matrix satisfy \(|\lambda| \leq 1\).
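As a numerical illustration of (c), outside the assignment, one can sample a random column-stochastic matrix and confirm all eigenvalues lie in the closed unit disk (the seed is an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)       # fixed seed, arbitrary choice
A = rng.random((4, 4))
A = A / A.sum(axis=0)                # normalize columns so A is stochastic
lam = np.linalg.eigvals(A)
assert np.all(np.abs(lam) <= 1 + 1e-9)        # |lambda| <= 1 for every eigenvalue
assert np.isclose(np.max(np.abs(lam)), 1.0)   # lambda = 1 is always attained
```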

#### Problem Set 9

(Problems in pdf. Answers and practice problems for the final.)

(1) (a) Suppose that \(v\) is the three-vector \([ 1 \ 0 \ 1 ]\). Find the matrix \(A\) for orthogonal projection onto the span of \(v\). (b) Check explicitly that \(A^2 = A\). (c) Find the eigenvalues and eigenvectors of \(A\).
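A numerical sketch (not part of the assignment, and assuming numpy) of what the answer to (1) should satisfy: the projection matrix is \(vv^T/(v \cdot v)\), it is idempotent, and its eigenvalues are \(0, 0, 1\):

```python
import numpy as np

v = np.array([1.0, 0.0, 1.0])
A = np.outer(v, v) / (v @ v)          # orthogonal projection onto span(v)
assert np.allclose(A @ A, A)          # part (b): A^2 = A
assert np.allclose(np.linalg.eigvalsh(A), [0.0, 0.0, 1.0])  # part (c)
assert np.allclose(A @ v, v)          # v is fixed: eigenvalue 1
```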

(2) Let \(V\) be an inner product space. Show that if \(P_W: V \to V\) is orthogonal projection onto a subspace \(W\), then \(P_W\) satisfies the relation \(\langle P_W v_1 , v_2 \rangle = \langle v_1, P_W v_2 \rangle\). (Hint: write out \(v_2\) as a sum of its projections and similarly for \(v_1\).)

(3) Show that an isomorphism \(T: V \to V\) is an orthogonal transformation iff for any orthonormal set \(B\), the image \(T(B)\) is also orthonormal.

(4) Find the orthogonal diagonalization of the matrix \(A = \left[ \begin{array}{lll} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{array} \right]\), that is, find an orthogonal matrix \(Q\) and a diagonal matrix \(D\) such that \(A = QDQ^T\). (Hint: find an orthogonal eigenbasis and use it to form \(Q\).)

(5) True or false: \(T: C^\infty([-1,1]) \to C^\infty([-1,1]),\ (Tf)(x) = f(-x)\) is an orthogonal transformation. Prove your answer.

(6) Let \(A\) be an \(n \times n\) matrix. For each eigenvalue \(\lambda\) prove that the generalized eigenspace \(\tilde{E}_\lambda = \{v | \exists k, (A- \lambda I)^k v = 0 \}\) is a subspace of \(\mathbb{R}^n\).

(7) Find a basis of generalized eigenvectors for the following matrices. In each case write down the Jordan form.

(a) \(A = \left[ \begin{array}{lll} 1 & 0 & 0 \\ 1 & 2 & 0 \\ 0 & 1 & 1 \end{array} \right]\)

(b) \(A = \left[ \begin{array}{lll} 2 & 0 & 0 \\ 1 & 2 & 0 \\ 1 & 1 & 2 \end{array} \right]\)

(c) \(A = \left[ \begin{array}{lll} 2 & 0 & 4 \\ 0 & 2 & 0 \\ 0 & 0 & 2 \end{array} \right]\)

(8) Find all possible \(2 \times 2\) Jordan matrices \(A\) satisfying \(A^3 = A\) and \(A^2 \neq A\). Justify your answer.

#### Practice Exam for First Midterm

(Problems in pdf)

(1) Find all polynomials \(f(x)\) of degree at most \(3\) satisfying \(f'(x) = f(x+1) - f(x)\).

(2) True or false: the functions \(1^x, 2^x, 3^x\) are linearly independent as functions of a real variable \(x\). Prove your answer.

(3) Show that the set of eventually-constant sequences (like \(3,2,1,1,1,1,\ldots \)) is a subspace of the vector space of sequences of real numbers. Find its dimension, and, if possible, a basis.

(4) Let \(V = W = \operatorname{span} B\) where \(B = \{ \sin(x),\cos(x) \}\). Let \(T: V \to W, (Tf)(x) = f(x + \pi/4)\). Find the matrix \(A\) of \(T\) with respect to \(B\). (1 point extra credit: what is \(A^8\) and why?)

(5) Let \(V = P_2\) and \(W = P_4\) and let \(T: V \to W\) be the function \((T f)(x) = f''(x) + x^2f(x)\). Prove that \(T\) is a linear transformation. What are the null space and range of \(T\)?

Additional Problems:

(6) Let \(A\) be the \( 3 \times 3\) matrix whose \(ij\)-th entry is \(i + j\) and let \(T\) be multiplication by \(A\). Find the range and nullspace of \(T\).

(7) Let \(V = W = \operatorname{span} B\), \(B = \{ 1^x, 2^x, 3^x \}\), where \(x\) is a real variable. Find the matrix of the linear transformation \(T(f) = f'\) with respect to \(B\).

(8) Show that \(V = \mathbb{R}^2\) with addition defined by \((x_1,y_1) + (x_2,y_2) = (x_1 + x_2,y_1y_2)\) and scalar multiplication \(c(x,y) = (cx,y)\) is not a vector space.

(9) Show that if \(W_1\) and \(W_2\) are subspaces of a vector space \(V\) then \(\dim(W_1 + W_2) \leq \dim(W_1) + \dim(W_2)\).

(10) Show that if \(T: V \to W\) is a one-to-one linear transformation and \(\{ v_1,\ldots, v_k \} \) is a linearly independent subset of \(V\) then \( \{ T(v_1),\ldots, T(v_k) \}\) is a linearly independent subset of \(W\).

(11) Let \(T: V \to V\) be a linear transformation. Show that \(\operatorname{rank}(T^2) \leq \operatorname{rank}(T)\) and \(\operatorname{nullity}(T^2) \ge \operatorname{nullity}(T)\).

(12) True/false: explain your answer in each case.

(a) Any vector space has a finite basis.

(b) A set \(\{ v_1,\ldots, v_k \}\) is linearly dependent iff the last vector \(v_k\) is a linear combination of the others.

(c) The empty set is a subspace of any vector space.

(d) Any spanning set for a vector space contains a basis.

(e) A linear transformation maps the zero vector to the zero vector.

(f) If a linear transformation from \(V\) to \(V\) is onto, then so is its square.

(g) If the matrix for a linear transformation has a zero column, then the linear transformation is not one-to-one.

(h) Any basis for \(P_n\) (the space of polynomials of degree at most \(n\)) has \(n\) elements.

(i) The dimension of the space of \(2 \times 2\) real symmetric matrices is \(2\).

#### Review Problems for the Second Exam

(Problems in pdf)

(1)

(a) Find the matrix for the linear transformation \(T: \mathbb{R}^3 \to \mathbb{R}^3\) defined by \((Tv)_k = (v_{k+1} + v_{k-1})/2\) (where the indices are taken mod \(3\), so \(v_4 = v_1, v_0 = v_3\).)

(b) Find the inverse of \(T\).

(c) Find the determinant of \(T\) using row reduction.

(d) Find the eigenvalues of \(T\).

(e) Find the diagonalization of \(T\).

(f) Let \(v = [1 \ 0 \ 0 ]\). Use (e) to find a formula for \(T^n v\) for any \(n\).
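Parts (c)-(e) can be sanity-checked numerically (not part of the review itself). The matrix of \(T\) is \(\frac12(S + S^T)\) for the cyclic shift \(S\), with eigenvalues \(1, -\tfrac12, -\tfrac12\) and determinant \(\tfrac14\). A sketch assuming numpy:

```python
import numpy as np

# T averages the two cyclic neighbours, indices taken mod 3.
T = 0.5 * np.array([[0.0, 1.0, 1.0],
                    [1.0, 0.0, 1.0],
                    [1.0, 1.0, 0.0]])
assert np.allclose(np.linalg.eigvalsh(T), [-0.5, -0.5, 1.0])  # part (d)
assert np.isclose(np.linalg.det(T), 0.25)                     # part (c)
assert np.allclose(T @ np.linalg.inv(T), np.eye(3))           # part (b): invertible
```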

(2) True or false: In each case prove your answer either way.

(a) Let \(T: V \to W\) be a linear transformation and \(B \subset V\) a basis. If \(T\) is one-to-one then \(T(B)\) is linearly independent.

(b) If \(A\) is a square matrix then \(A\) and \(A^T\) have the same eigenvectors.

(c) Any square matrix is a product of elementary matrices.

(d) The eigenvalues of a symmetric matrix are all real.

(e) Any skew-symmetric \(2 \times 2\) matrix is diagonalizable.

(f) \(\det(A + B) = \det(A) + \det(B)\) for any square matrices of the same size.

(g) \(\det(P_1 P_2) = \det(P_1) \det(P_2)\) for any permutation matrices \(P_1,P_2\). (Do not use \(\det(AB) = \det(A) \det(B)\).)

(h) Any two-dimensional inner product space has an orthonormal basis.

(3) True or false: In each case prove your answer either way.

(a) Consider the linear transformation \(T: \mathbb{C}^n \to \mathbb{C}^n,\ e_1 \mapsto e_2,\ e_2 \mapsto e_3,\ \ldots,\ e_n \mapsto e_1\). Then \(T\) has \(n\) distinct eigenvalues.

(b) The map \((f,g) \mapsto \int_{0}^1 x f(x) \overline{g(x)} dx\) defines an inner product on the space of continuous functions on \([0,1]\).

(c) For vectors \(v_1,v_2\) in an inner product space, if \(\Vert v_1 + v_2 \Vert^2 = \Vert v_1 \Vert^2 + \Vert v_2 \Vert^2\) then \(v_1\) and \(v_2\) are orthogonal.

(d) Let \(V\) be the space of bounded functions from the integers \(\mathbb{Z}\) to the complex numbers \(\mathbb{C}\). The shift operator \((Tf)_k = f_{k-1}\) has infinitely many eigenvalues.

(e) In an inner product space \(V\), for any vector \(x \in V\) we have \(\langle 0, x \rangle = 0\).

(f) In an inner product space \(V\), for any subspace \(W\), if \(w \in W\) then the orthogonal projection of \(w\) onto \(W\) is equal to \(w\).

(4) (a) Find an orthogonal basis for the subspace \(W = \{ (x,y,z,w) \ | \ x + y + z + w = 0 \}\) in \(\mathbb{R}^4\).

(b) Find the linear function \(a + bx\) that best approximates the quadratic function \(x^2\) on the interval \([0,2]\) with respect to the standard inner product on \(C^\infty([0,2])\). (That is, find the function \(a + bx\) closest to \(x^2\).) Draw a graph of the function \(x^2\) and the approximation you found.
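Not part of the assignment, but the normal equations for (b) can be solved numerically. Assuming the standard inner product \(\langle f,g \rangle = \int_0^2 fg\,dx\), the Gram entries below are computed by hand, and the best approximation works out to \(2x - \tfrac23\):

```python
import numpy as np

# Gram matrix of {1, x} and right-hand side <1, x^2>, <x, x^2>
# under <f, g> = integral from 0 to 2 of f g dx (entries integrated by hand).
G = np.array([[2.0, 2.0],
              [2.0, 8.0 / 3.0]])
rhs = np.array([8.0 / 3.0, 4.0])
a, b = np.linalg.solve(G, rhs)
assert np.allclose([a, b], [-2.0 / 3.0, 2.0])   # best fit: -2/3 + 2x
```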

(c) (For additional practice) Make the set of functions \(\{ 1, x^2, x^4 \}\) on \([0,1]\) into an orthogonal set with respect to the standard inner product on \(C^\infty([0,1])\).

(5) (a) Suppose that the reduced row echelon form of a matrix is \(\left[ \begin{array}{llll} 1 & 2 & 0 & 1 \\ 0 & 0 & 1 & 2 \\ 0 & 0 & 0 & 0 \end{array} \right]\). If the second and fourth columns of the matrix are \(\left[ \begin{array}{l} 1 \\ 2 \\ 1 \end{array} \right]\) and \(\left[ \begin{array}{l} 1 \\ 0 \\ 1 \end{array} \right]\) respectively, what is the matrix?

(b) Find a basis for the null-space of the matrix.

(c) Find a basis for the column-space of the matrix.

(6) (For additional practice)

(a) Suppose that \(V\) is a vector space and \(U,W\) are subspaces so that \(U + W = V\) and \(U \cap W = \{ 0 \}\). Show that any element of \(V\) can be written uniquely as the sum of an element of \(U\) and an element of \(W\).

(b) Suppose that \(S : U \to V\) and \(T: V \to W\) are linear transformations. Show that \(N(S) \subseteq N(T \circ S)\).

(c) Let \(A\) be a square matrix. Show that if all the columns of \(A\) sum to zero, then \(A\) is not invertible.

#### Recommended Practice Problems (the problem sets from last year)

Due date | Problems |
---|---|
Sept. 13 | 1.2 #17; 1.3 #19, 23; 1.4 #11, 13; 1.5 #9, 15 |
Sept. 20 | 1.6 #20, 21, 26, 29; 1.7 #5, 6 |
Sept. 27 | 2.1 #3, 11, 28; 2.2 #4; 2.3 #12; 2.4 #15, 17 |
October 4 | 2.5 #3(d), 7(a,b), 13; 2.6 #5, 10; Show that F[x]* ≅ F[[x]]. |
October 18 | 3.1 #6, 12; 3.2 #5(b,d,h), 17; 3.3 #8, 10; 3.4 #8, 15; If an n×n matrix A has each row sum 0, some Ax=b has no solution. |
October 25 | 4.1 #10(a,c); 4.2 #23; 4.3 #12, 22(c), 25(c); 4.4 #6; 4.5 #11, 12 |
Nov. 1 | 5.1 #3(b), 20, 21; 5.2 #4, 9(a), 12; Show that the cross product induces an isomorphism between R³ and Λ²(R³). |
Nov. 8 | 5.2 #18(a), 21; 5.3 #2(d,f); 5.4 #6(a), 13, 19, 25 |
Nov. 15 | 7.1 #3(b), 9(a), 13; 7.2 #3, 14, 19(a); 7.3 #13, 14; Find all 4×4 Jordan canonical forms of T satisfying T²=T³. |
Dec. 13 | 6.1 #6, 11, 12, 17; 6.2 #2a, 6, 11; 6.8 #4(a,c,d), 11 |