- If A, B, and C are each n × n matrices and ABC is invertible, why are each of A, B, and C invertible?
- Give an example of a 3 × 2 matrix with the property that the linear transformation determined by this matrix is one-to-one but not onto.
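One way to sanity-check a candidate answer to the problem above is numerically. The matrix below is only an illustrative candidate (one of many possible answers), sketched in Python:

```python
# One candidate 3x2 matrix, viewed as a map R^2 -> R^3.
# One-to-one: Ax = 0 forces x = 0, since Ax = (x1, x2, 0).
# Not onto: (0, 0, 1) has no preimage, because the third
# coordinate of Ax is always 0.

def apply(A, x):
    """Multiply matrix A (list of rows) by vector x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 0],
     [0, 1],
     [0, 0]]

# Only the zero vector maps to zero.
assert apply(A, [0, 0]) == [0, 0, 0]

# Every image vector ends in 0, so (0, 0, 1) is missed.
assert apply(A, [3.0, -2.0])[2] == 0
```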
- Explain why Ax = 0 always has a solution whenever A is a linear transformation.
- Recall that a line in ℝ^n is of the form x + tv where t ∈ ℝ. Recall that v is a “direction vector”. Show that if T : ℝ^n → ℝ^m is linear, then the image of a line under T is either a line or a point.
- In the following examples, a linear transformation T is given by specifying its action on a basis β. Find its matrix with respect to this basis.
- T= 2+ 1,T=
- T= 2+ 1,T=
- T= 2+ 1,T= 1−

- T
- ↑ In each example above, find a matrix A such that for every x ∈ ℝ^2, Tx = Ax.
- Consider the linear transformation T_θ which rotates every vector in ℝ^2 through the angle θ. Find the matrix A_θ such that T_θx = A_θx. Hint: You need to have the columns of A_θ be Te_1 and Te_2. Review why this is so before using it. Then simply find these vectors from trigonometry.
- ↑ If you did the above problem right, you got

      A_θ = [ cos θ   −sin θ ]
            [ sin θ    cos θ ]

  Derive the famous trig identities for the sine and cosine of a sum of two angles by using the fact that A_{θ+ϕ} = A_θ A_ϕ and the above description.
- Let β be a basis for F^n and let T : F^n → F^n be defined as follows.

  First show that T is a linear transformation. Next show that the matrix of T with respect to this basis is

      [T]_β =

  Show that the above definition is equivalent to simply specifying T on the basis vectors of β by
- Let T be given by specifying its action on the vectors of a basis β as follows.

  Letting A =

  verify that [T]_β = A. It is done in the chapter, but go over it yourself. Show that [T]_γ equals the matrix in (5.8).
- Let a be a fixed vector. The function T_a defined by T_a v = a + v has the effect of translating all vectors by adding a. Show this is not a linear transformation. Explain why it is not possible to realize T_a in ℝ^3 by multiplying by a 3 × 3 matrix.
- ↑ In spite of Problem 11, we can represent both translations and rotations by matrix multiplication at the expense of using higher dimensions. This is done by homogeneous coordinates. I will illustrate in ℝ^3, where most interest in this is found. For each vector v = (v_1, v_2, v_3)^T, consider the vector (v_1, v_2, v_3, 1)^T in ℝ^4. What happens when you multiply this vector by the block matrix

      [ I  a ]
      [ 0  1 ]

  where I is the 3 × 3 identity? Describe how to consider both rotations and translations all at once by forming appropriate 4 × 4 matrices.

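The homogeneous-coordinates recipe above can be made concrete. A minimal sketch, assuming the standard convention of appending a 1 to each vector and using 4 × 4 block matrices of the form [ R a ; 0 1 ]; it also checks the A_{θ+ϕ} = A_θ A_ϕ fact from the rotation problem:

```python
import math

def matmul(A, B):
    """Product of matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def rot_z(theta):
    """4x4 homogeneous matrix: rotation about the z axis by theta."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0, 0],
            [s,  c, 0, 0],
            [0,  0, 1, 0],
            [0,  0, 0, 1]]

def translate(a):
    """4x4 homogeneous matrix: translation by the vector a."""
    return [[1, 0, 0, a[0]],
            [0, 1, 0, a[1]],
            [0, 0, 1, a[2]],
            [0, 0, 0, 1]]

# Rotating by theta and then by phi equals rotating by theta + phi,
# which is exactly the A_{theta+phi} = A_theta A_phi identity.
theta, phi = 0.3, 0.4
P = matmul(rot_z(phi), rot_z(theta))
Q = rot_z(theta + phi)
assert all(abs(P[i][j] - Q[i][j]) < 1e-12 for i in range(4) for j in range(4))

# Translate by (1, 2, 3), then rotate 30 degrees about z, as ONE matrix.
M = matmul(rot_z(math.pi / 6), translate([1, 2, 3]))
v = [[0.0], [0.0], [0.0], [1.0]]   # the origin in homogeneous form
w = matmul(M, v)                    # image: the rotated point (1, 2, 3)
assert abs(w[3][0] - 1.0) < 1e-12   # last coordinate stays 1
```

The point of the last coordinate is that the affine map v ↦ Rv + a, which is not linear on ℝ^3, becomes honest matrix multiplication on ℝ^4.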
- You want to add a given vector to every point in ℝ^3 and then rotate about the z axis counterclockwise through an angle of 30°. Find what happens to a given point.
- Let P_3 denote the set of real polynomials of degree no more than 3, defined on an interval. Show that P_3 is a subspace of the vector space of all functions defined on this interval. Show that a basis for P_3 is {1, x, x^2, x^3}. Now let D denote the differentiation operator which sends a function to its derivative. Show D is a linear transformation which sends P_3 to P_3. Find the matrix of this linear transformation with respect to the given basis.
- Generalize the above problem to P_n, the space of polynomials of degree no more than n, with basis {1, x, …, x^n}.
- If A is an n × n invertible matrix, show that A^T is also invertible and that in fact, (A^T)^{−1} = (A^{−1})^T.
- Suppose you have an invertible n × n matrix A. Consider the polynomials

  Show that these polynomials p_1, …, p_n are a linearly independent set of functions.
- Let the linear transformation be T = D^2 + 1, defined as Tf = f″ + f. Find the matrix of this linear transformation with respect to the given basis.
- Let L be the linear transformation taking polynomials of degree at most three to polynomials of degree at most three given by D^2 + 2D + 1, where D is the differentiation operator. Find the matrix of this linear transformation relative to the basis {1, x, x^2, x^3}. Find the matrix directly, and then find the matrix with respect to the differential operator D + 1 and multiply this matrix by itself. You should get the same thing. Why?
- Let L be the linear transformation taking polynomials of degree at most three to polynomials of degree at most three given by D^2 + 5D + 4, where D is the differentiation operator. Find the matrix of this linear transformation relative to the basis {1, x, x^2, x^3}. Find the matrix directly, and then find the matrices with respect to the differential operators D + 1 and D + 4 and multiply these two matrices. You should get the same thing. Why?
- Suppose A ∈ ℒ(V, W) where dim(V) > dim(W). Show ker(A) ≠ {0}. That is, show there exist nonzero vectors v ∈ V such that Av = 0.
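A concrete instance of the kernel claim above, with an illustrative 2 × 3 matrix standing in for a linear map from a 3-dimensional space to a 2-dimensional one:

```python
# Any linear map from R^3 to R^2 must kill a nonzero vector:
# three columns in R^2 cannot be independent.  The matrix below
# is just an illustrative choice.

def apply(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[1, 2, 3],
     [0, 1, 1]]

# Solve Av = 0 by back substitution: set v3 = 1, then v2 = -1, v1 = -1.
v = [-1, -1, 1]
assert apply(A, v) == [0, 0]
assert any(x != 0 for x in v)   # v is a nonzero vector in the kernel
```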
- A vector v is in the convex hull of a nonempty set S if there are finitely many vectors of S, {v_1, …, v_m}, and nonnegative scalars {t_1, …, t_m} such that

      v = ∑_{k=1}^{m} t_k v_k,   ∑_{k=1}^{m} t_k = 1.

  Such a linear combination is called a convex combination. Suppose now that S ⊆ V, a vector space of dimension n. Show that if v = ∑_{k=1}^{m} t_k v_k is a vector in the convex hull for m > n + 1, then there exist other nonnegative scalars summing to 1 which express v as a convex combination of fewer than m of the vectors v_k. Thus every vector in the convex hull of S can be obtained as a convex combination of at most n + 1 points of S. This incredible result is in Rudin [29]. Hint: Consider L : ℝ^m → V × ℝ defined by

      L(a_1, …, a_m) ≡ ( ∑_{k=1}^{m} a_k v_k, ∑_{k=1}^{m} a_k ).

  Explain why ker(L) ≠ {0}. Next, letting a ∈ ker(L)∖{0} and λ ∈ ℝ, note that λa ∈ ker(L). Thus for all λ ∈ ℝ,

      v = ∑_{k=1}^{m} (t_k + λa_k) v_k.

  Now vary λ till some t_k + λa_k = 0 for some a_k ≠ 0.
- For those who know about compactness, use Problem 22 to show that if S ⊆ ℝ^n and S is compact, then so is its convex hull.
- Show that if L ∈ ℒ(V, W) (a linear transformation), where V and W are vector spaces, and Ly_p = f for some y_p ∈ V, then the general solution of Ly = f is of the form ker(L) + y_p.
- Suppose Ax = b has a solution. Explain why the solution is unique precisely when Ax = 0 has only the trivial (zero) solution.
- Let L : ℝ^n → ℝ be linear. Show that there exists a vector a ∈ ℝ^n such that Ly = a^T y.
- Let the linear transformation T be determined by

  Find the rank of this transformation.
- Let Tf = f for f in the vector space of polynomials of degree no more than 3, where we consider T to map into the same vector space. Find the rank of T. You might want to use Proposition 4.3.6.
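Several of the problems above ask for the matrix of a differential operator on P_3 and for a rank. A minimal sketch, assuming the basis {1, x, x^2, x^3}, that builds the matrix of the differentiation operator D column by column and computes its rank:

```python
# Column j of the matrix of D holds the coordinates of D applied to
# the j-th basis polynomial of {1, x, x^2, x^3}.

def deriv(coeffs):
    """Differentiate a cubic given by coefficients [c0, c1, c2, c3]."""
    return [i * coeffs[i] for i in range(1, len(coeffs))] + [0]

basis = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
cols = [deriv(p) for p in basis]
D = [[cols[j][i] for j in range(4)] for i in range(4)]  # columns -> matrix
assert D == [[0, 1, 0, 0],
             [0, 0, 2, 0],
             [0, 0, 0, 3],
             [0, 0, 0, 0]]

def rank(M):
    """Rank via Gaussian elimination (exact enough for these integers)."""
    M = [row[:] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        M[r] = [x / M[r][c] for x in M[r]]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

assert rank(D) == 3   # D kills the constants, so its rank on P_3 is 3
```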
- (Extra important) Let A be an n × n matrix. The trace of A, trace(A), is defined as ∑_i A_{ii}. It is just the sum of the entries on the main diagonal. Show trace(A) = trace(A^T). Suppose A is m × n and B is n × m. Show that trace(AB) = trace(BA). Now show that if A and B are similar n × n matrices, then trace(A) = trace(B). Recall that A is similar to B means A = S^{−1}BS for some matrix S.
- Suppose you have a monic polynomial ϕ(λ) which is irreducible over F, the field of scalars. Remember that this means that no polynomial divides it except scalar multiples of ϕ(λ) and scalars. Say

  Now consider A ∈ ℒ(V, V) where V is a vector space. Consider ker(ϕ(A)) and suppose this is not 0. For x ∈ ker(ϕ(A)), x ≠ 0, let β_x = {x, Ax, …, A^{d−1}x}, where d is the degree of ϕ. Show that β_x is an independent set of vectors if x ≠ 0.
- ↑ Let V be a finite dimensional vector space and let A ∈ ℒ(V, V). Also let W be a subspace of V such that A(W) ⊆ W. We call such a subspace an A invariant subspace. Say {w_1, …, w_s} is a basis for W. Also let x ∈ U∖W where U is an A invariant subspace which is contained in ker(ϕ(A)). Then you know that {w_1, …, w_s, x} is linearly independent. Show that in fact {w_1, …, w_s} ∪ β_x is linearly independent, where β_x is given in the above problem. Hint: Suppose you have

      ∑_{k=1}^{s} a_k w_k + ∑_{j=1}^{d} b_j A^{j−1} x = 0.   (*)

  You need to verify that the second sum is 0. From this it will follow that each b_j is 0 and then each a_k = 0. Let S = ∑_{j=1}^{d} b_j A^{j−1} x. Observe that span(β_S) ⊆ span(β_x), and if S ≠ 0, then β_S is independent by the above problem and β_x and β_S span subspaces of the same dimension. You will argue that span(β_S) ⊆ W ∩ span(β_x) ⊆ span(β_x) and then use Problem 6 on Page 190.
- ↑ In the situation of the above problem, show that there exist finitely many vectors x_1, …, x_p in U such that {w_1, …, w_s} together with β_{x_1}, …, β_{x_p} is a basis for U + W. This last vector space is defined as the set of all y + w where y ∈ U and w ∈ W.
- ↑ In the situation of the above, where ϕ(λ) is irreducible, let U be defined as

  Explain why U ⊆ ker(ϕ(A)). Suppose you have a linearly independent set in U which is of the form β_{x_1}, …, β_{x_p}. Here the notation β_x means {x, Ax, …, A^{m−1}x}, where these vectors are independent but A^m x is in the span of these. Such an m exists any time you have x ∈ ker(g(A)) for g(λ) a polynomial. Letting ϕ(A)y_i = x_i, explain why β_{y_1}, …, β_{y_p} is also linearly independent. This is like the theorem presented earlier that the inverse image of a linearly independent set is linearly independent, but it is more complicated here because instead of single vectors, we are considering sets β_x.
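The β_x construction in the problems above can be checked in a tiny concrete case. A sketch assuming ϕ(λ) = λ^2 + 1, which is irreducible over ℝ, with A taken to be rotation of ℝ^2 by 90°, so that ϕ(A) = A^2 + I = 0 and every nonzero x lies in ker ϕ(A):

```python
# A = rotation by 90 degrees.  Since A^2 = -I, we have phi(A) = A^2 + I = 0,
# so ker(phi(A)) is all of R^2 and beta_x = {x, Ax} should be independent
# for every nonzero x.

def apply(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, -1],
     [1,  0]]

A2 = matmul(A, A)
assert A2 == [[-1, 0], [0, -1]]   # phi(A) = A^2 + I = 0

def det2(u, v):
    """Determinant of the 2x2 matrix with columns u and v."""
    return u[0] * v[1] - u[1] * v[0]

x = [2.0, 1.0]
Ax = apply(A, x)
# det [x | Ax] = x1^2 + x2^2 > 0, so beta_x = {x, Ax} is independent.
assert det2(x, Ax) == x[0] ** 2 + x[1] ** 2
assert det2(x, Ax) != 0
```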
