- For the linear transformations determined by multiplication by each of the following matrices, find the minimum polynomial.
- Here is a matrix; its minimum polynomial is λ^{3} − 3λ^{2} + 4λ − 2. Obtain a block diagonal matrix similar to this one.
- Suppose A ∈ ℒ(V,V) where V is a finite dimensional vector space and suppose p(λ) is the minimum polynomial. Say p(λ) = λ^{m} + a_{m−1}λ^{m−1} + ⋯ + a_{1}λ + a_{0}. If A is one to one, show that it is onto and also that A^{−1} ∈ ℒ(V,V). In this case, explain why a_{0} ≠ 0, and give a formula for A^{−1} as a polynomial in A.
- Let A be the given matrix, whose minimum polynomial is λ^{2} − 3λ + 2. Find A^{10} exactly. Hint: You can do long division and get λ^{10} = l(λ)(λ^{2} − 3λ + 2) + 1023λ − 1022.
- Suppose A ∈ ℒ(V,V) and it has minimum polynomial p which has degree m. It is desired to compute A^{n} for n large. Show that it is possible to obtain A^{n} in terms of a polynomial in A of degree less than m.
- Determine whether the following matrices are diagonalizable, where a is a real number. Assume the field of scalars is ℂ.
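As a sanity check on the two exercises about A^{10} and A^{n}, here is a minimal sketch, assuming a hypothetical 2×2 matrix A (the exercise's matrix is not reproduced here) whose minimum polynomial is λ^{2} − 3λ + 2:

```python
# A sketch, assuming a hypothetical 2x2 matrix A whose minimum
# polynomial is λ^2 - 3λ + 2 (the exercise's matrix is not shown).
# Long division gives λ^10 = l(λ)(λ^2 - 3λ + 2) + 1023λ - 1022, so
# A^10 = 1023A - 1022I, a polynomial in A of degree less than 2.

def mat_mul(X, Y):
    """Multiply two 2x2 matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 1], [0, 2]]   # eigenvalues 1 and 2, minimum polynomial λ^2 - 3λ + 2
I = [[1, 0], [0, 1]]

# Direct computation of A^10 by repeated multiplication.
P = I
for _ in range(10):
    P = mat_mul(P, A)

# Via the remainder of λ^10 modulo the minimum polynomial.
R = [[1023 * A[i][j] - 1022 * I[i][j] for j in range(2)] for i in range(2)]

print(P)       # [[1, 1023], [0, 1024]]
print(P == R)  # True
```

The same remainder trick works for any n: divide λ^{n} by the minimum polynomial and evaluate the remainder at A.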

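The next exercise remarks that the minimum polynomial can be determined by row operations on the vectors I, A, A^{2}, … regarded as elements of F^{n²}. A minimal sketch of that idea over F = ℚ, with exact arithmetic and a hypothetical 2×2 example (the helper names are ours, not from the text):

```python
# Sketch: flatten I, A, A^2, ... into vectors of length n^2 and use
# Gaussian elimination to find the first linear dependence; its
# coefficients give the minimum polynomial. Hypothetical example over Q.

from fractions import Fraction

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def flatten(M):
    return [Fraction(x) for row in M for x in row]

def solve(target, basis):
    """Solve sum_i c_i * basis[i] = target exactly, or return None."""
    m, n2 = len(basis), len(target)
    # Augmented system: column j is basis[j], right-hand side is target.
    M = [[basis[j][i] for j in range(m)] + [target[i]] for i in range(n2)]
    row, pivots = 0, []
    for col in range(m):
        piv = next((r for r in range(row, n2) if M[r][col] != 0), None)
        if piv is None:
            continue
        M[row], M[piv] = M[piv], M[row]
        M[row] = [x / M[row][col] for x in M[row]]
        for r in range(n2):
            if r != row and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[row])]
        pivots.append(col)
        row += 1
    if any(M[r][m] != 0 for r in range(row, n2)):
        return None              # no solution: the powers are independent
    sol = [Fraction(0)] * m
    for r, col in enumerate(pivots):
        sol[col] = M[r][m]
    return sol

def min_poly(A):
    """Coefficients [a_0, ..., a_{m-1}, 1] of the minimum polynomial."""
    n = len(A)
    I = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    powers = [I]
    while True:
        nxt = mat_mul(powers[-1], A)   # next power of A
        coeffs = solve(flatten(nxt), [flatten(P) for P in powers])
        if coeffs is not None:         # A^m = sum c_i A^i, so p(A) = 0
            return [-c for c in coeffs] + [Fraction(1)]
        powers.append(nxt)

A = [[Fraction(2), Fraction(1)],
     [Fraction(0), Fraction(2)]]       # hypothetical example
print([int(c) for c in min_poly(A)])   # [4, -4, 1]: λ^2 - 4λ + 4 = (λ - 2)^2
```

Since all the row operations happen in F, the computation never leaves the base field, which is the point of the remark.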
- The situation for diagonalizability was presented for the case in which the minimum polynomial factors completely as a product of linear factors, since this is certainly the case of most interest, including ℂ. What if the minimum polynomial does not split? Is there a theorem available that will allow one to conclude that the matrix is diagonalizable in a splitting field, possibly larger than the given field? It is a reasonable question because the assumption that p, p^{′} are relatively prime may be determined without factoring the polynomials and involves only computations in the given field F. If you enlarge the field, what happens to the minimum polynomial? Does it stay the same or does it change? Remember, the matrix has entries all in the smaller field F while the splitting field G is larger than F, but you can determine the minimum polynomial using row operations on vectors in F^{n²}.
- Suppose V is a finite dimensional vector space and suppose N ∈ ℒ(V,V) satisfies N^{m} = 0 for some m ≥ 1. Show that the only eigenvalue of N is 0.
- Suppose V is an n dimensional vector space and suppose β is a basis for V. Consider the map μI : V → V given by μIv = μv. What is the matrix of this map with respect to the basis β? Hint: You should find that it is μ times the identity matrix, whose ij^{th} entry is δ_{ij}, which is 1 if i = j and 0 if i ≠ j. Thus the ij^{th} entry of this matrix will be μδ_{ij}.
- In the case that the minimum polynomial factors, which was discussed above, we had
V = V_{1} ⊕ V_{2} ⊕ ⋯ ⊕ V_{q}. If V_{i} = ker(L − μ_{i}I)^{k_{i}}, then by definition, (L_{i} − μ_{i}I)^{k_{i}} = 0, where here L_{i} is the restriction of L to V_{i}. If N = L_{i} − μ_{i}I, then N : V_{i} → V_{i} and N^{k_{i}} = 0. This is the definition of a nilpotent transformation, one which has a high enough power equal to 0. Suppose then that N : V → V where V is an m dimensional vector space. We will show that there is a basis for V such that with respect to this basis, the matrix of N is block diagonal and of the form

  $$\begin{pmatrix} N_{1} & & \\ & \ddots & \\ & & N_{s} \end{pmatrix}$$

  where N_{i} is an r_{i} × r_{i} matrix of the form

  $$\begin{pmatrix} 0 & 1 & & \\ & 0 & \ddots & \\ & & \ddots & 1 \\ & & & 0 \end{pmatrix}$$

  That is, there are ones down the superdiagonal and zeros everywhere else. Now consider the case where N_{i} = L_{i} − μ_{i}I on one of the V_{i} as just described. Use the preceding problem and the special basis β_{i} just described for N_{i} to show that the matrix of L_{i} with respect to this basis is of the form

  $$\begin{pmatrix} J_{r_{1}}(\mu_{i}) & & \\ & \ddots & \\ & & J_{r_{s}}(\mu_{i}) \end{pmatrix}$$

  where J_{r}(μ_{i}) is of the form

  $$J_{r}(\mu_{i}) = \begin{pmatrix} \mu_{i} & 1 & & \\ & \mu_{i} & \ddots & \\ & & \ddots & 1 \\ & & & \mu_{i} \end{pmatrix}$$

  This is called a Jordan block. Now let β = (β_{1}, β_{2}, …, β_{q}). Explain why the matrix of L with respect to this basis is of the form

  $$\begin{pmatrix} J(\mu_{1}) & & \\ & \ddots & \\ & & J(\mu_{q}) \end{pmatrix}$$

  This special matrix is called the Jordan canonical form. This problem shows that its study reduces to the study of the matrix of a nilpotent transformation. You see that it is a block diagonal matrix such that each block is itself a block diagonal matrix which is also an upper triangular matrix having one eigenvalue down its main diagonal and strings of ones on the superdiagonal.
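A quick check of the nilpotent block just described, using a hypothetical 4×4 example (not from the text): the matrix with ones down the superdiagonal and zeros elsewhere satisfies N^{r} = 0 while N^{r−1} ≠ 0.

```python
# Sketch, assuming a hypothetical 4x4 instance of the nilpotent block:
# ones on the superdiagonal, zeros everywhere else, so N^r = 0 while
# N^(r-1) != 0 (N has nilpotency index r).

r = 4
N = [[1 if j == i + 1 else 0 for j in range(r)] for i in range(r)]

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(X, p):
    n = len(X)
    P = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
    for _ in range(p):
        P = mat_mul(P, X)
    return P

zero = [[0] * r for _ in range(r)]
print(mat_pow(N, r) == zero)      # True: N is nilpotent
print(mat_pow(N, r - 1) == zero)  # False: the power r is needed
```

In the notation above, N sends each basis vector e_{k+1} to e_{k} and e_{1} to 0, which is exactly a chain of length r.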

- Now in this problem, the method for finding the special basis for a nilpotent transformation is given. Let V be a vector space and let N ∈ ℒ(V,V) be nilpotent. First note that the only eigenvalue of N is 0. Why? (See Problem 8.) Let v_{1} be an eigenvector. Then {v_{1}, v_{2}, …, v_{r}} is called a chain based on v_{1} if Nv_{k+1} = v_{k} for all k = 1, 2, …, r and v_{1} is an eigenvector, so Nv_{1} = 0. It will be called a maximal chain if there is no solution v to the equation Nv = v_{r}. Now there will be a sequence of steps leading to the desired basis.
  a. Show that the vectors in any chain are linearly independent and that for {v_{1}, …, v_{r}} a chain based on v_{1},

     N(span(v_{1}, …, v_{r})) ⊆ span(v_{1}, …, v_{r}).  (6.3)

     Also if {v_{1}, …, v_{r}} is a chain, then r ≤ n. Hint: If 0 = ∑_{i=1}^{r} c_{i}v_{i} and the last nonzero scalar occurs at l, apply N^{l−1} to the sum and see what happens to c_{l}.
  b. Consider the set of all chains based on eigenvectors. Since all have total length no larger than n, there exists one which has maximal length, B_{1} ≡ {v_{1}^{1}, …, v_{r_{1}}^{1}}. If span(B_{1}) contains all eigenvectors of N, then stop. Otherwise, consider all chains based on eigenvectors not in span(B_{1}) and pick one, B_{2} ≡ {v_{1}^{2}, …, v_{r_{2}}^{2}}, which is as long as possible. Thus r_{2} ≤ r_{1}. If span(B_{1}, B_{2}) contains all eigenvectors of N, stop. Otherwise, consider all chains based on eigenvectors not in span(B_{1}, B_{2}) and pick one, B_{3} ≡ {v_{1}^{3}, …, v_{r_{3}}^{3}}, such that r_{3} is as large as possible. Continue this way, so that r_{k} ≥ r_{k+1}. Then show that the above process terminates with a finite list of chains B_{1}, …, B_{s} because for any k, B_{1} ∪ ⋯ ∪ B_{k} is linearly independent. Hint: From part a. you know this is true if k = 1. Suppose it is true for k − 1 and, letting L_{i} denote a linear combination of vectors of B_{i}, suppose ∑_{i=1}^{k} L_{i} = 0. Then we can assume L_{k} ≠ 0 by induction. Let v_{i}^{k} be the last term in L_{k} which has a nonzero scalar. Now act on the whole thing with N^{i−1} to find v_{1}^{k} as a linear combination of vectors in B_{1} ∪ ⋯ ∪ B_{k−1}, a contradiction to the construction. You fill in the details.
  c. Suppose Nw = 0 (w is an eigenvector). Show that there exist scalars c_{i} such that w = ∑_{i=1}^{s} c_{i}v_{1}^{i}. Recall that v_{1}^{i} is the eigenvector on which the i^{th} chain is based. You know that w is a linear combination of the vectors in B_{1} ∪ ⋯ ∪ B_{s}; this says that in fact it is a linear combination of the bottom vectors in the B_{i}. Hint: You know that w = ∑_{i=1}^{s} L_{i}. Let v_{i}^{s} be the last term in L_{s} which has a nonzero scalar. Suppose that i > 1. Now apply N^{i−1} to both sides and obtain that v_{1}^{s} is in the span of B_{1} ∪ ⋯ ∪ B_{s−1}, which is a contradiction. Hence i = 1 and so the only term of L_{s} is one involving an eigenvector. Now do something similar to L_{s−1}, L_{s−2}, etc. You fill in the details.
  d. If Nw = 0, then w ∈ span(v_{1}^{1}, …, v_{1}^{s}). This was what was just shown; in fact, it was a particular linear combination involving the bases of the chains. What if N^{k}w = 0? Does it still follow that w ∈ span(B_{1} ∪ ⋯ ∪ B_{s})? Show that if N^{k}w = 0, then w ∈ span(B_{1} ∪ ⋯ ∪ B_{s}). Hint: Say k is as small as possible such that N^{k}w = 0. Then N^{k−1}w is an eigenvector and so

     N^{k−1}w = ∑_{i=1}^{s} c_{i}v_{1}^{i}.

     If N^{k−1}w is the base of some chain B_{i}, then there is nothing to show. Otherwise, consider the chain N^{k−1}w, N^{k−2}w, …, w. It cannot be any longer than any of the chains B_{1}, B_{2}, …, B_{s}. Why? Therefore, v_{1}^{i} = N^{k−1}v_{k}^{i}. Why is v_{k}^{i} ∈ B_{i}? This is where you use that this chain is no longer than any of the B_{i}. Thus

     N^{k−1}(w − ∑_{i=1}^{s} c_{i}v_{k}^{i}) = 0.

     By induction (fill in the details), w − ∑_{i=1}^{s} c_{i}v_{k}^{i} ∈ span(B_{1} ∪ ⋯ ∪ B_{s}), and hence so is w.
  e. Since N is nilpotent, ker(N^{m}) = V for some m, and so all of V is in span(B_{1} ∪ ⋯ ∪ B_{s}).
  f. Now explain why the matrix of N with respect to the ordered basis (B_{1}, B_{2}, …, B_{s}) is the kind of thing desired and described in the above problem, and explain why the size of the blocks decreases from upper left to lower right. To see why the matrix is like the above, consider that on the i^{th} chain, Nv_{k+1}^{i} = v_{k}^{i} and Nv_{1}^{i} = 0, so the i^{th} block M_{i} is an r_{i} × r_{i} matrix with ones down the superdiagonal and zeros elsewhere, where r_{i} is the length of the i^{th} chain.

  If you have gotten through this, then along with the previous problem, you have proved the existence of the Jordan canonical form, one of the greatest results in linear algebra. It will be considered a different way later. Specifically, you have shown that if the minimum polynomial splits, then the linear transformation has a matrix of the following form:

  $$\begin{pmatrix} J(\mu_{1}) & & \\ & \ddots & \\ & & J(\mu_{q}) \end{pmatrix}$$

  where, without loss of generality, you can arrange these blocks to be decreasing in size from the upper left to the lower right, and J(μ) is of the form

  $$J(\mu) = \begin{pmatrix} J_{r_{1}}(\mu) & & \\ & \ddots & \\ & & J_{r_{s}}(\mu) \end{pmatrix}$$

  where J_{r}(μ) is the r × r matrix of the following form

  $$J_{r}(\mu) = \begin{pmatrix} \mu & 1 & & \\ & \mu & \ddots & \\ & & \ddots & 1 \\ & & & \mu \end{pmatrix}$$

  and the blocks J_{r}(μ) can also be arranged to have their size decreasing from the upper left to the lower right.
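The chain construction above can be tried on a small example. The sketch below assumes a hypothetical 2×2 nilpotent matrix, builds a maximal chain {v_{1}, v_{2}}, and verifies by a change of basis that the matrix of N with respect to this chain is the canonical nilpotent block.

```python
# Sketch, assuming a hypothetical 2x2 nilpotent matrix N with N^2 = 0:
# pick any v2 with N v2 != 0 and set v1 = N v2. Then {v1, v2} is a
# maximal chain based on the eigenvector v1, and in this basis the
# matrix of N is the canonical block with a single superdiagonal 1.

from fractions import Fraction

N = [[Fraction(1), Fraction(1)],
     [Fraction(-1), Fraction(-1)]]   # N^2 = 0, but N is not in block form

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_vec(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

v2 = [Fraction(1), Fraction(0)]      # any vector with N v2 != 0
v1 = mat_vec(N, v2)                  # the eigenvector the chain is based on
assert mat_vec(N, v1) == [0, 0]      # N v1 = 0, so v1 is an eigenvector

# Change of basis: P has the chain vectors as columns; compute P^{-1} N P.
P = [[v1[0], v2[0]], [v1[1], v2[1]]]
d = P[0][0] * P[1][1] - P[0][1] * P[1][0]
Pinv = [[P[1][1] / d, -P[0][1] / d], [-P[1][0] / d, P[0][0] / d]]
M = mat_mul(Pinv, mat_mul(N, P))

print([[int(x) for x in row] for row in M])  # [[0, 1], [0, 0]]
```

The chain here is maximal because N^{2} = 0 means Nv = v_{2} would force N^{2}v = Nv_{2} = v_{1} ≠ 0, a contradiction.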

- (Extra important) The following theorem gives an easy condition under which the Jordan canonical form will be a diagonal matrix. Suppose A ∈ ℒ(V,V) and suppose (λ_{i}, u_{i}), i = 1, 2, …, m, are eigen-pairs such that if i ≠ j, then λ_{i} ≠ λ_{j}. Then {u_{1}, …, u_{m}} is linearly independent. In words, eigenvectors from distinct eigenvalues are linearly independent. Hint: Suppose ∑_{i=1}^{k} c_{i}u_{i} = 0 where k is as small as possible such that not all of the c_{i} = 0. Then c_{k} ≠ 0. Explain why k > 1 and

  ∑_{i=1}^{k} c_{i}λ_{i}u_{i} = 0.

  Now, multiplying the first equation by λ_{k} and subtracting,

  ∑_{i=1}^{k−1} c_{i}(λ_{i} − λ_{k})u_{i} = 0.

  Obtain a contradiction of some sort at this point. Thus if an n × n matrix has n distinct eigenvalues, then the corresponding eigenvectors will be a linearly independent set, so the matrix will be diagonalizable and all the Jordan blocks will be single numbers.
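To illustrate the theorem, here is a sketch with a hypothetical 3×3 matrix having three distinct eigenvalues; the eigenvectors, found by hand from (A − λI)u = 0, are checked and then shown linearly independent via a nonzero determinant.

```python
# Sketch, assuming a hypothetical upper triangular 3x3 matrix with
# distinct eigenvalues 1, 2, 3. By the theorem, its three eigenvectors
# must be linearly independent, so the matrix whose columns are these
# eigenvectors has nonzero determinant.

A = [[1, 1, 0],
     [0, 2, 1],
     [0, 0, 3]]

# Eigenvectors found by solving (A - λI)u = 0 by hand:
u1 = [1, 0, 0]   # λ = 1
u2 = [1, 1, 0]   # λ = 2
u3 = [1, 2, 2]   # λ = 3

def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

# Check each is an eigenvector with the claimed eigenvalue.
for lam, u in [(1, u1), (2, u2), (3, u3)]:
    assert mat_vec(A, u) == [lam * x for x in u]

# Independence: determinant of the matrix with eigenvector columns.
def det3(M):
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
            - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
            + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

U = [[u1[i], u2[i], u3[i]] for i in range(3)]
print(det3(U))   # 2, nonzero, so the eigenvectors are independent
```

Since the eigenvalues are distinct, this matrix is diagonalizable and every Jordan block is 1 × 1, exactly as the exercise concludes.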
