- Show $(A^{*})^{*} = A$ and $(AB)^{*} = B^{*}A^{*}$.
- Prove Corollary 14.11.10.
- Show that if $A$ is an $n \times n$ matrix which has an inverse, then $A^{+} = A^{-1}$.
- Using the singular value decomposition, show that for any square matrix $A$, the matrix $A^{*}A$ is unitarily similar to $AA^{*}$.
- Let $A, B$ be $m \times n$ matrices. Define an inner product on the set of $m \times n$ matrices by
$$\langle A, B \rangle \equiv \operatorname{trace}\left(AB^{*}\right).$$
Show this is an inner product satisfying all the inner product axioms. Recall that for $M$ an $n \times n$ matrix, $\operatorname{trace}(M) \equiv \sum_{i=1}^{n} M_{ii}$. The resulting norm $\left\|\cdot\right\|_{F}$ is called the Frobenius norm, and it can be used to measure the distance between two matrices.
- It was shown that a matrix $A$ is normal if and only if it is unitarily similar to a diagonal matrix. It was also shown that if a matrix is Hermitian, then it is unitarily similar to a real diagonal matrix. Show the converse of this last statement is also true: if a matrix is unitarily similar to a real diagonal matrix, then it is Hermitian.
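The Frobenius inner product above can be checked numerically. The following is a minimal sketch in pure Python (the matrices and helper names are illustrative choices, not from the text) verifying conjugate symmetry and positivity for a pair of complex $2 \times 2$ matrices.

```python
# Sketch: the Frobenius inner product <A, B> = trace(A B*) on m x n matrices.
# Pure Python with nested lists; the matrices below are illustrative choices.

def conj_transpose(M):
    """Return M* (conjugate transpose) of a matrix given as a list of rows."""
    return [[M[i][j].conjugate() for i in range(len(M))] for j in range(len(M[0]))]

def matmul(A, B):
    """Ordinary matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def frobenius_inner(A, B):
    """<A, B> = trace(A B*)."""
    P = matmul(A, conj_transpose(B))
    return sum(P[i][i] for i in range(len(P)))

A = [[1 + 2j, 0], [3, 1j]]
B = [[2, 1], [0, 1 - 1j]]

# Conjugate symmetry: <A, B> = conj(<B, A>)
assert frobenius_inner(A, B) == frobenius_inner(B, A).conjugate()

# Positivity: <A, A> = ||A||_F^2 = sum of |A_ij|^2 > 0 for A != 0
norm_sq = frobenius_inner(A, A)
assert abs(norm_sq - sum(abs(A[i][j])**2 for i in range(2) for j in range(2))) < 1e-12
```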
- Let $A$ be an $m \times n$ matrix. Show $\left\|A\right\|_{F}^{2} \equiv \langle A, A \rangle_{F} = \sum_{j} \sigma_{j}^{2}$, where the $\sigma_{j}$ are the singular values of $A$.
- If $A$ is a general $n \times n$ matrix having possibly repeated eigenvalues, show there is a sequence $\{A_{k}\}$ of $n \times n$ matrices having distinct eigenvalues which has the property that the $ij^{\text{th}}$ entry of $A_{k}$ converges to the $ij^{\text{th}}$ entry of $A$ for all $ij$. Hint: Use Schur's theorem.
- Prove the Cayley–Hamilton theorem as follows. First suppose $A$ has a basis of eigenvectors $\{v_{k}\}_{k=1}^{n}$, $Av_{k} = \lambda_{k}v_{k}$. Let $p(\lambda)$ be the characteristic polynomial. Show $p(A)v_{k} = p(\lambda_{k})v_{k} = 0$. Then since $\{v_{k}\}$ is a basis, it follows $p(A)x = 0$ for all $x$ and so $p(A) = 0$. Next, in the general case, use Problem 8 to obtain a sequence $\{A_{k}\}$ of matrices whose entries converge to the entries of $A$ such that $A_{k}$ has $n$ distinct eigenvalues and therefore, by Theorem 6.5.1 on Page 386, $A_{k}$ has a basis of eigenvectors. Therefore, from the first part and for $p_{k}(\lambda)$ the characteristic polynomial for $A_{k}$, it follows $p_{k}(A_{k}) = 0$. Now explain why, and the sense in which, $\lim_{k\to\infty} p_{k}(A_{k}) = p(A)$.
- Show directly that if $A$ is an $n \times n$ matrix and $A = A^{*}$ ($A$ is Hermitian), then all the eigenvalues are real and eigenvectors can be assumed to be real, and that eigenvectors associated with distinct eigenvalues are orthogonal (their inner product is zero).
- Let $v_{1}, \dots, v_{n}$ be an orthonormal basis for $\mathbb{F}^{n}$. Let $Q$ be a matrix whose $i^{\text{th}}$ column is $v_{i}$. Show that $Q$ is unitary; that is, $Q^{*}Q = QQ^{*} = I$.
- Show that an $n \times n$ matrix $Q$ is unitary if and only if it preserves distances. This means $\left\|Qv\right\| = \left\|v\right\|$. This was done in the text, but you should try to do it for yourself.
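The Cayley–Hamilton theorem from the problems above can be checked numerically for a small matrix. For $n = 2$ the characteristic polynomial is $p(t) = t^{2} - \operatorname{trace}(A)\,t + \det(A)$, so the theorem asserts $A^{2} - \operatorname{trace}(A)A + \det(A)I = 0$. The matrix below is an arbitrary illustrative choice.

```python
# Sketch: verify Cayley-Hamilton for a 2 x 2 matrix.
# p(t) = t^2 - trace(A) t + det(A), so p(A) should be the zero matrix.

A = [[2, 1],
     [4, 3]]

tr = A[0][0] + A[1][1]                       # trace(A) = 5
det = A[0][0]*A[1][1] - A[0][1]*A[1][0]      # det(A) = 2

# A^2 computed directly
A2 = [[sum(A[i][k]*A[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

# p(A) = A^2 - trace(A) A + det(A) I
I = [[1, 0], [0, 1]]
pA = [[A2[i][j] - tr*A[i][j] + det*I[i][j] for j in range(2)] for i in range(2)]

assert pA == [[0, 0], [0, 0]]  # Cayley-Hamilton: p(A) = 0
```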
- Suppose $\{v_{1}, \dots, v_{n}\}$ and $\{w_{1}, \dots, w_{n}\}$ are two orthonormal bases for $\mathbb{F}^{n}$ and suppose $Q$ is an $n \times n$ matrix satisfying $Qv_{i} = w_{i}$. Then show $Q$ is unitary. If $\left\|v\right\| = 1$, show there is a unitary transformation which maps $v$ to $e_{1}$. This is done in the text, but do it yourself with all details.
- Let $A$ be a Hermitian matrix, so $A = A^{*}$, and suppose all eigenvalues of $A$ are larger than $\delta^{2}$. Show
$$\langle Au, u \rangle \geq \delta^{2}\left\|u\right\|^{2},$$
where here the inner product is $\langle v, u \rangle \equiv \sum_{j=1}^{n} v_{j}\overline{u_{j}}$.
- The discrete Fourier transform maps $\mathbb{C}^{n} \to \mathbb{C}^{n}$ as follows:
$$F(x) = z, \qquad z_{k} = \frac{1}{\sqrt{n}} \sum_{j=0}^{n-1} e^{-i\frac{2\pi}{n}jk} x_{j}.$$
Show that $F^{-1}$ exists and is given by the formula
$$F^{-1}(z) = x, \qquad x_{j} = \frac{1}{\sqrt{n}} \sum_{k=0}^{n-1} e^{i\frac{2\pi}{n}jk} z_{k}.$$
Here is one way to approach this problem. Note $z = Ux$ where
$$U_{kj} = \frac{1}{\sqrt{n}} e^{-i\frac{2\pi}{n}jk}.$$
Now argue $U$ is unitary and use this to establish the result. To show this, verify each row has length 1 and the inner product of two different rows gives 0. Then, since $U_{kj} = \frac{1}{\sqrt{n}} e^{-i\frac{2\pi}{n}jk}$, it follows $\left(U^{-1}\right)_{kj} = \left(U^{*}\right)_{kj} = \frac{1}{\sqrt{n}} e^{i\frac{2\pi}{n}jk}$.
- Let $f$ be a periodic function having period $2\pi$. The Fourier series of $f$ is an expression of the form
$$\sum_{k=-\infty}^{\infty} c_{k} e^{ikx},$$
and the idea is to find $c_{k}$ such that the above series converges in some way to $f$. If
$$f(x) = \sum_{k=-\infty}^{\infty} c_{k} e^{ikx}$$
and you formally multiply both sides by $e^{-imx}$ and then integrate from $0$ to $2\pi$, interchanging the integral with the sum without any concern for whether this makes sense, show it is reasonable from this to expect
$$c_{m} = \frac{1}{2\pi} \int_{0}^{2\pi} f(x) e^{-imx}\, dx.$$
Now suppose you only know $f$ at equally spaced points $2\pi j/n$ for $j = 0, 1, \dots, n$. Consider the Riemann sum for this integral obtained from using the left endpoint of the subintervals determined from the partition $\{2\pi j/n\}_{j=0}^{n}$. How does this compare with the discrete Fourier transform? What happens as $n \to \infty$ to this approximation?
- Suppose $A$ is a real $3 \times 3$ orthogonal matrix (recall this means $AA^{T} = A^{T}A = I$) having determinant 1. Show it must have an eigenvalue equal to 1. Note this shows there exists a vector $x \neq 0$ such that $Ax = x$. Hint: Show first, or recall, that any orthogonal matrix must preserve lengths. That is, $\left\|Ax\right\| = \left\|x\right\|$.
- Let $A$ be a complex $m \times n$ matrix. Using the description of the Moore–Penrose inverse in terms of the singular value decomposition, show that
$$\lim_{\delta \to 0+} \left(A^{*}A + \delta^{2}I\right)^{-1} A^{*} = A^{+},$$
where the convergence happens in the Frobenius norm. Also verify, using the singular value decomposition, that the inverse exists in the above formula. Observe that this shows that the Moore–Penrose inverse is unique.
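The unitarity of the DFT matrix described above can be checked numerically. The following sketch builds $U_{kj} = \frac{1}{\sqrt{n}} e^{-i\frac{2\pi}{n}jk}$ for $n = 4$ (the test vector is an arbitrary illustrative choice) and confirms that $U^{*}U = I$, so that applying $U^{*}$ recovers $x$ from $z = Ux$.

```python
# Sketch: numerical check that the DFT matrix U is unitary, for n = 4.
import cmath

n = 4
root = cmath.exp(-2j * cmath.pi / n)   # primitive n-th root of unity

U = [[root**(j * k) / n**0.5 for j in range(n)] for k in range(n)]
Ustar = [[U[j][k].conjugate() for j in range(n)] for k in range(n)]  # conjugate transpose

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

x = [1.0, 2.0, 0.0, -1.0]       # arbitrary test vector
z = matvec(U, x)                # z = F(x)
x_back = matvec(Ustar, z)       # applying U* should recover x

assert all(abs(x_back[j] - x[j]) < 1e-12 for j in range(n))

# U* U = I entrywise
UstarU = [[sum(Ustar[i][k] * U[k][j] for k in range(n)) for j in range(n)] for i in range(n)]
assert all(abs(UstarU[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(n) for j in range(n))
```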

- Show that $A^{+} = \left(A^{*}A\right)^{+} A^{*}$. Hint: You might use the description of $A^{+}$ in terms of the singular value decomposition.
- In Theorem 14.10.1, show that every matrix which commutes with $A$ also commutes with $A^{1/k}$, the unique nonnegative self-adjoint $k^{\text{th}}$ root.
- Let $X$ be a finite dimensional inner product space and let $\beta = \{u_{1}, \dots, u_{n}\}$ be an orthonormal basis for $X$. Let $A \in \mathcal{L}(X,X)$ be self-adjoint and nonnegative and let $M$ be its matrix with respect to the given orthonormal basis. Show that $M$ is nonnegative and self-adjoint also. Use this to show that $A$ has a unique nonnegative self-adjoint $k^{\text{th}}$ root.
- Let $A$ be a complex $m \times n$ matrix having singular value decomposition $U^{*}AV = \begin{pmatrix} \sigma & 0 \\ 0 & 0 \end{pmatrix}$ as explained above, where $\sigma$ is $k \times k$. Show that $\ker(A)$ equals the span of the last $n - k$ columns of $V$.
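The limit formula $A^{+} = \lim_{\delta \to 0+}\left(A^{*}A + \delta^{2}I\right)^{-1}A^{*}$ from the problems above can be observed numerically. Below is a minimal sketch for a rank-one real matrix whose Moore–Penrose inverse happens to equal itself (the matrix and function name are illustrative choices); the regularized expression approaches $A^{+}$ as $\delta$ shrinks.

```python
# Sketch: (A^T A + delta^2 I)^{-1} A^T -> A^+ as delta -> 0, for a singular 2 x 2 A.

A = [[1.0, 0.0],
     [0.0, 0.0]]   # singular, so A^{-1} does not exist, but A^+ = A here

def regularized_pinv(A, delta):
    """(A^T A + delta^2 I)^{-1} A^T for a 2 x 2 real A, via the adjugate formula."""
    # M = A^T A + delta^2 I
    M = [[A[0][0]**2 + A[1][0]**2 + delta**2, A[0][0]*A[0][1] + A[1][0]*A[1][1]],
         [A[0][0]*A[0][1] + A[1][0]*A[1][1], A[0][1]**2 + A[1][1]**2 + delta**2]]
    d = M[0][0]*M[1][1] - M[0][1]*M[1][0]
    Minv = [[M[1][1]/d, -M[0][1]/d], [-M[1][0]/d, M[0][0]/d]]
    # Minv @ A^T; note (A^T)[k][j] = A[j][k]
    return [[sum(Minv[i][k] * A[j][k] for k in range(2)) for j in range(2)]
            for i in range(2)]

# The (0, 0) entry is 1/(1 + delta^2), which tends to (A^+)_{00} = 1
assert abs(regularized_pinv(A, 1e-6)[0][0] - 1.0) < 1e-9
```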

- The principal submatrices of an $n \times n$ matrix $A$ are the matrices $A_{k}$, where $A_{k}$ consists of those entries which are in the first $k$ rows and first $k$ columns of $A$. Suppose $A$ is a real symmetric matrix and that $x \to \langle Ax, x \rangle$ is positive definite. This means that if $x \neq 0$, then $\langle Ax, x \rangle > 0$. Show that each of the principal submatrices is positive definite. Hint: Consider $\begin{pmatrix} x^{T} & 0 \end{pmatrix} A \begin{pmatrix} x \\ 0 \end{pmatrix}$ where $x$ consists of $k$ entries.
- ↑A matrix $A$ has an LU factorization if there exists a lower triangular matrix $L$ having all ones on the diagonal and an upper triangular matrix $U$ such that $A = LU$. Show that if $A$ is a symmetric positive definite $n \times n$ real matrix, then $A$ has an LU factorization with the property that each entry on the main diagonal of $U$ is positive. Hint: This is pretty clear if $A$ is $1 \times 1$. Assume true for $(n-1) \times (n-1)$. Then
$$A = \begin{pmatrix} \hat{A} & a \\ a^{T} & a_{nn} \end{pmatrix}.$$
Then, as above, $\hat{A}$ is positive definite. Thus it has an LU factorization $\hat{A} = \hat{L}\hat{U}$ with all positive entries on the diagonal of $\hat{U}$. Notice that, using block multiplication,
$$A = \begin{pmatrix} \hat{L} & 0 \\ a^{T}\hat{U}^{-1} & 1 \end{pmatrix} \begin{pmatrix} \hat{U} & \hat{L}^{-1}a \\ 0 & a_{nn} - a^{T}\hat{U}^{-1}\hat{L}^{-1}a \end{pmatrix}.$$
Now consider the matrix on the right. Argue that it is of the form $\tilde{U}$, where $\tilde{U}$ has all positive diagonal entries except possibly for the one in the $n^{\text{th}}$ row and $n^{\text{th}}$ column. Now explain why $\det(A) > 0$ and argue that in fact all diagonal entries of $\tilde{U}$ are positive.
- ↑Let $A$ be a real symmetric $n \times n$ matrix and $A = LU$, where $L$ has all ones down the diagonal and $U$ has all positive entries down the main diagonal. Show that $A = LDH$, where $L$ is lower triangular and $H$ is upper triangular, each having all ones down the diagonal, and $D$ is a diagonal matrix having all positive entries down the main diagonal. In fact, these are the diagonal entries of $U$.
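The passage from $A = LU$ to $A = LDH$ in the problem above can be sketched concretely. In the sketch below (the matrix is an illustrative choice), $D$ holds the diagonal entries of $U$ and $H = D^{-1}U$ has ones on its diagonal; for symmetric $A$, $H$ comes out equal to $L^{T}$, which is the step toward the Cholesky factorization.

```python
# Sketch: extract L D H from an LU factorization of a symmetric positive definite A.

A = [[4.0, 2.0],
     [2.0, 3.0]]

# LU by one step of Gaussian elimination (no pivoting needed: A is positive definite)
l21 = A[1][0] / A[0][0]
L = [[1.0, 0.0], [l21, 1.0]]
U = [[A[0][0], A[0][1]], [0.0, A[1][1] - l21 * A[0][1]]]

# D = diag(U), H = D^{-1} U  (H has ones down the diagonal)
D = [[U[0][0], 0.0], [0.0, U[1][1]]]
H = [[U[0][0] / D[0][0], U[0][1] / D[0][0]], [0.0, U[1][1] / D[1][1]]]

def matmul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

LDH = matmul2(matmul2(L, D), H)
assert all(abs(LDH[i][j] - A[i][j]) < 1e-12 for i in range(2) for j in range(2))

# For symmetric A, H equals L^T
assert all(abs(H[i][j] - L[j][i]) < 1e-12 for i in range(2) for j in range(2))
```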
- ↑Show that if $L, L_{1}$ are lower triangular with ones down the main diagonal, $H, H_{1}$ are upper triangular with all ones down the main diagonal, and $D, D_{1}$ are diagonal matrices having all positive diagonal entries, and if $LDH = L_{1}D_{1}H_{1}$, then $L = L_{1}$, $H = H_{1}$, $D = D_{1}$. Hint: Explain why $D_{1}^{-1}L_{1}^{-1}LD = H_{1}H^{-1}$. Then explain why the right side is upper triangular and the left side is lower triangular. Conclude these are both diagonal matrices. However, the expression on the right has all ones down the diagonal. Hence $H = H_{1}$. Do something similar to conclude that $L = L_{1}$ and then that $D = D_{1}$.
- ↑Show that if $A$ is a symmetric real matrix such that $x \to \langle Ax, x \rangle$ is positive definite, then there exists a lower triangular matrix $L$ having all positive entries down the diagonal such that $A = LL^{T}$. Hint: From the above, $A = LDH$, where $L, H$ are respectively lower and upper triangular having all ones down the diagonal and $D$ is a diagonal matrix having all positive entries. Then argue from the above problem and symmetry of $A$ that $H = L^{T}$. Now modify $L$ by making it equal to $LD^{1/2}$. This is called the Cholesky factorization.
- Given $F \in \mathcal{L}(X,Y)$, where $X, Y$ are inner product spaces and $\dim(X) = n \leq m = \dim(Y)$, there exist $R, U$ such that $U$ is nonnegative and Hermitian ($U = U^{*}$) and $R^{*}R = I$ such that $F = RU$. Show that $U$ is actually unique and that $R$ is determined on $U(X)$. This was done in the book, but try to remember why this is so.
- If $A$ is a complex Hermitian $n \times n$ matrix which has all eigenvalues nonnegative, show that there exists a complex Hermitian matrix $B$ such that $BB = A$.
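The Cholesky factorization $A = LL^{T}$ described above can be computed by the standard algorithm. The sketch below (the matrix is an illustrative symmetric positive definite choice) builds $L$ column by column; the diagonal square roots are of positive numbers precisely because $A$ is positive definite.

```python
# Sketch: Cholesky factorization A = L L^T for a symmetric positive definite matrix.
import math

def cholesky(A):
    """Return lower triangular L with positive diagonal such that A = L L^T."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # positive since A is positive definite
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

A = [[4.0, 2.0, 0.0],
     [2.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
L = cholesky(A)

# Check A = L L^T and that the diagonal of L is positive
for i in range(3):
    for j in range(3):
        assert abs(sum(L[i][k] * L[j][k] for k in range(3)) - A[i][j]) < 1e-12
assert all(L[i][i] > 0 for i in range(3))
```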
- ↑Suppose $A, B$ are $n \times n$ real Hermitian matrices and they both have all nonnegative eigenvalues. Show that $\det(A+B) \geq \det(A) + \det(B)$. Hint: Use the above problem and the Cauchy–Binet theorem. Let $P^{2} = A$, $Q^{2} = B$, where $P, Q$ are Hermitian and nonnegative. Then
$$A + B = \begin{pmatrix} P & Q \end{pmatrix} \begin{pmatrix} P \\ Q \end{pmatrix}.$$
- Suppose $B = \begin{pmatrix} \alpha & c^{*} \\ b & A \end{pmatrix}$ is an $(n+1) \times (n+1)$ Hermitian nonnegative matrix, where $\alpha$ is a scalar and $A$ is $n \times n$. Show that $\alpha$ must be real, $c = b$, $A = A^{*}$, $A$ is nonnegative, and that if $\alpha = 0$, then $b = 0$. Otherwise, $\alpha > 0$.
- ↑If $A$ is an $n \times n$ complex Hermitian and nonnegative matrix, show that there exists an upper triangular matrix $B$ such that $B^{*}B = A$. Hint: Prove this by induction. It is obviously true if $n = 1$. Now if you have an $(n+1) \times (n+1)$ Hermitian nonnegative matrix, then from the above problem, it is of the form $\begin{pmatrix} \alpha & b^{*} \\ b & A \end{pmatrix}$, $\alpha$ real.
- ↑Suppose $A$ is a nonnegative Hermitian matrix (all eigenvalues are nonnegative) which is partitioned as
$$A = \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix},$$
where $A_{11}, A_{22}$ are square matrices. Show that $\det(A) \leq \det(A_{11})\det(A_{22})$. Hint: Use the above problem to factor $A$, getting
$$A = \begin{pmatrix} B_{11}^{*} & 0 \\ B_{12}^{*} & B_{22}^{*} \end{pmatrix} \begin{pmatrix} B_{11} & B_{12} \\ 0 & B_{22} \end{pmatrix}.$$
Next argue that $A_{11} = B_{11}^{*}B_{11}$, $A_{22} = B_{12}^{*}B_{12} + B_{22}^{*}B_{22}$. Use the Cauchy–Binet theorem to argue that
$$\det(A_{22}) = \det\left(B_{12}^{*}B_{12} + B_{22}^{*}B_{22}\right) \geq \det\left(B_{22}^{*}B_{22}\right).$$
Then explain why
$$\det(A) = \det\left(B_{11}^{*}B_{11}\right)\det\left(B_{22}^{*}B_{22}\right) \leq \det(A_{11})\det(A_{22}).$$
- ↑Prove the inequality of Hadamard. If $A$ is a Hermitian matrix which is nonnegative (all eigenvalues are nonnegative), then $\det(A) \leq \prod_{i} A_{ii}$.
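Hadamard's inequality above can be observed numerically. The sketch below builds a nonnegative symmetric matrix as $A = B^{T}B$ (the matrix $B$ is an arbitrary illustrative choice, and this construction guarantees all eigenvalues of $A$ are nonnegative) and checks $\det(A) \leq \prod_{i} A_{ii}$.

```python
# Sketch: numerical check of Hadamard's inequality det(A) <= prod_i A_ii
# for a nonnegative symmetric matrix A = B^T B.

B = [[1.0, 2.0, 0.0],
     [0.0, 1.0, 3.0],
     [1.0, 0.0, 1.0]]

# A = B^T B is symmetric with nonnegative eigenvalues
A = [[sum(B[k][i] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def det3(M):
    """Determinant of a 3 x 3 matrix by cofactor expansion."""
    return (M[0][0] * (M[1][1]*M[2][2] - M[1][2]*M[2][1])
          - M[0][1] * (M[1][0]*M[2][2] - M[1][2]*M[2][0])
          + M[0][2] * (M[1][0]*M[2][1] - M[1][1]*M[2][0]))

diag_product = A[0][0] * A[1][1] * A[2][2]
assert det3(A) <= diag_product          # Hadamard's inequality
assert abs(det3(A) - det3(B)**2) < 1e-9 # det(A) = det(B)^2 >= 0
```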
