- Find the best solution to the system.

- Find an orthonormal basis for ℝ^{3}, given that w_{1} is a multiple of the given vector.

- Suppose A = A^{T} is a symmetric real n × n matrix which has all positive eigenvalues. Define ⟨x,y⟩ ≡ x^{T}Ay. Show this is an inner product on ℝ^{n}. What does the Cauchy Schwarz inequality say in this case?

- Let ‖x‖_{∞} ≡ max{|x_{j}| : j = 1,…,n}. Show this is a norm on ℂ^{n}. Here x = (x_{1},…,x_{n})^{T}. Show ‖x‖_{∞} ≤ |x|, where |x| ≡ ⟨x,x⟩^{1∕2} and the above is the usual inner product on ℂ^{n}.

- Let ‖x‖_{1} ≡ ∑_{j=1}^{n}|x_{j}|. Show this is a norm on ℂ^{n}. Here x = (x_{1},…,x_{n})^{T}. Show |x| ≤ ‖x‖_{1}, where the above is the usual inner product on ℂ^{n}. Show there cannot exist an inner product such that this norm comes from the inner product as described above for inner product spaces.

- Show that if ‖⋅‖ is any norm on any vector space, then |‖x‖ − ‖y‖| ≤ ‖x − y‖.
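A quick numerical check of the last claim about ‖⋅‖_{1}: any norm coming from an inner product must satisfy the parallelogram identity ‖x + y‖^{2} + ‖x − y‖^{2} = 2‖x‖^{2} + 2‖y‖^{2}, and the 1-norm fails it. A minimal sketch in Python, with hypothetical test vectors:

```python
# Any norm induced by an inner product satisfies the parallelogram identity
#   ||x + y||^2 + ||x - y||^2 = 2||x||^2 + 2||y||^2.
# The 1-norm violates it, so it cannot come from an inner product.

def norm1(x):                          # ||x||_1 = sum of absolute values
    return sum(abs(t) for t in x)

def norm2(x):                          # Euclidean norm, induced by the usual inner product
    return sum(t * t for t in x) ** 0.5

def parallelogram_gap(norm, x, y):
    s = [a + b for a, b in zip(x, y)]
    d = [a - b for a, b in zip(x, y)]
    return norm(s) ** 2 + norm(d) ** 2 - 2 * norm(x) ** 2 - 2 * norm(y) ** 2

x, y = [1.0, 0.0], [0.0, 1.0]          # hypothetical test vectors
print(parallelogram_gap(norm2, x, y))  # essentially 0: identity holds
print(parallelogram_gap(norm1, x, y))  # 4.0: identity fails for the 1-norm
```

The choice of x and y here is just one convenient pair; a single failing pair is already enough to rule out an inner product.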
- Relax the assumptions in the axioms for the inner product. Change the axiom about ⟨x,x⟩ ≥ 0 and equals 0 if and only if x = 0 to simply read ⟨x,x⟩ ≥ 0. Show the Cauchy Schwarz inequality still holds in the following form: |⟨x,y⟩| ≤ ⟨x,x⟩^{1∕2}⟨y,y⟩^{1∕2}.

- Let H be an inner product space and let {u_{k}}_{k=1}^{n} be an orthonormal basis for H. Show ⟨x,y⟩ = ∑_{k=1}^{n}⟨x,u_{k}⟩⟨u_{k},y⟩.

- Let the vector space V consist of real polynomials of degree no larger than 3. Thus a typical vector is a polynomial of the form a + bx + cx^{2} + dx^{3}. For p,q ∈ V define the inner product ⟨p,q⟩ ≡ ∫_{0}^{1}pq dx. Show this is indeed an inner product. Then state the Cauchy Schwarz inequality in terms of this inner product. Show {1, x, x^{2}, x^{3}} is a basis for V. Finally, find an orthonormal basis for V. This is an example of some orthonormal polynomials.

- Let P_{n} denote the polynomials of degree no larger than n − 1 which are defined on an interval [a,b]. Let {x_{1},…,x_{n}} be n distinct points in [a,b]. Now define for p,q ∈ P_{n}, ⟨p,q⟩ ≡ ∑_{k=1}^{n}p(x_{k})q(x_{k}). Show this yields an inner product on P_{n}. Hint: Most of the axioms are obvious. The one which says ⟨p,p⟩ = 0 if and only if p = 0 is the only interesting one. To verify this one, note that a nonzero polynomial of degree no more than n − 1 has at most n − 1 zeros.

- Let C([0,1]) denote the vector space of continuous real valued functions defined on [0,1]. Let the inner product be given as ⟨f,g⟩ ≡ ∫_{0}^{1}fg dx. Show this is an inner product. Also let V be the subspace described in Problem 9. Using the result of this problem, find the vector in V which is closest to x^{4}.

- A regular Sturm Liouville problem involves the differential equation, for an unknown function of x which is denoted here by y,
  (p(x)y′)′ + (λq(x) + r(x))y = 0, x ∈ [a,b],
  and it is assumed that p(t), q(t) > 0 for any t ∈ [a,b] and also there are boundary conditions
  C_{1}y(a) + C_{2}y′(a) = 0, C_{3}y(b) + C_{4}y′(b) = 0.
  There is an immense theory connected to these important problems. The constant λ is called an eigenvalue. Show that if y is a solution to the above problem corresponding to λ = λ_{1} and if z is a solution corresponding to λ = λ_{2} ≠ λ_{1}, then
  ∫_{a}^{b}q(x)y(x)z(x) dx = 0, (11.9)
  and this defines an inner product. Hint: Do something like this:
  ((p(x)y′)′ + (λ_{1}q(x) + r(x))y)z = 0, ((p(x)z′)′ + (λ_{2}q(x) + r(x))z)y = 0.
  Subtract and then integrate. Use the boundary conditions to show that y′(a)z(a) − z′(a)y(a) = 0 and y′(b)z(b) − z′(b)y(b) = 0. The formula 11.9 is called an orthogonality relation. It turns out there are typically infinitely many eigenvalues and it is interesting to write given functions as an infinite series of these “eigenfunctions”.

- Consider the continuous functions defined on [0,π], C([0,π]). Show ⟨f,g⟩ ≡ ∫_{0}^{π}fg dx is an inner product on this vector space. Show the functions {√(2∕π) sin(nx)}_{n=1}^{∞} are an orthonormal set. What does this mean about the dimension of the vector space C([0,π])? Now let V_{N} = span(√(2∕π) sin(x),…,√(2∕π) sin(Nx)). For f ∈ C([0,π]) find a formula for the vector in V_{N} which is closest to f with respect to the norm determined from the above inner product. This is called the N^{th} partial sum of the Fourier series of f. An important problem is to determine whether and in what way this Fourier series converges to the function f. The norm which comes from this inner product is sometimes called the mean square norm.

- Consider the subspace V ≡ ker(A), where A is the given matrix. Find an orthonormal basis for V. Hint: You might first find a basis and then use the Gram Schmidt procedure.
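The hinted procedure can be sketched numerically. Below is a minimal classical Gram Schmidt in Python; the input basis is a hypothetical choice, not the (omitted) kernel from the problem:

```python
# Classical Gram-Schmidt: orthonormalize a linearly independent list of vectors.
def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = list(v)
        for u in basis:
            c = sum(a * b for a, b in zip(v, u))   # coefficient <v, u>
            w = [a - c * b for a, b in zip(w, u)]  # subtract the projection onto u
        n = sum(t * t for t in w) ** 0.5
        basis.append([t / n for t in w])           # normalize the remainder
    return basis

# hypothetical basis of a 2-dimensional subspace of R^3
Q = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
```

The returned vectors have unit length and are mutually orthogonal, which is exactly what the exercise asks you to produce by hand.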

- The Gram Schmidt process starts with a basis for a subspace, {v_{1},…,v_{n}}, and produces an orthonormal basis for the same subspace, {u_{1},…,u_{n}}, such that span(v_{1},…,v_{k}) = span(u_{1},…,u_{k}) for each k. Show that in the case of ℝ^{m} the QR factorization does the same thing. More specifically, if A = (v_{1} ⋯ v_{n}) and if A = QR ≡ (q_{1} ⋯ q_{n})R, then the vectors {q_{1},…,q_{n}} form an orthonormal set and for each k, span(v_{1},…,v_{k}) = span(q_{1},…,q_{k}).

- Verify the parallelogram identity for any inner product space: ‖x + y‖^{2} + ‖x − y‖^{2} = 2‖x‖^{2} + 2‖y‖^{2}. Why is it called the parallelogram identity?
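The correspondence between Gram Schmidt and QR can be illustrated by computing a thin QR factorization column by column: by construction each q_{k} is built only from a_{1},…,a_{k}, R comes out upper triangular, and A = QR. The test matrix below is a hypothetical example:

```python
# Column-by-column QR via Gram-Schmidt: q_k is built from a_k and q_1..q_{k-1},
# so span(a_1,...,a_k) = span(q_1,...,q_k); recording r[i][k] = <a_k, q_i>
# (with r[k][k] the leftover length) makes R upper triangular with A = QR.
def qr_columns(cols):
    n = len(cols)
    qs, r = [], [[0.0] * n for _ in range(n)]
    for k, a in enumerate(cols):
        w = list(a)
        for i, q in enumerate(qs):
            r[i][k] = sum(x * y for x, y in zip(a, q))
            w = [x - r[i][k] * y for x, y in zip(w, q)]
        r[k][k] = sum(t * t for t in w) ** 0.5
        qs.append([t / r[k][k] for t in w])
    return qs, r

# hypothetical 3 x 2 matrix given by its columns
cols = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]]
qs, r = qr_columns(cols)
```

Reassembling each column as a_{k} = ∑_{i≤k} r[i][k] q_{i} recovers A, which is the content of the exercise.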

- Let H be an inner product space and let K ⊆ H be a nonempty convex subset. This means that if k_{1},k_{2} ∈ K, then the line segment consisting of points of the form tk_{1} + (1 − t)k_{2} for t ∈ [0,1] is also contained in K. Suppose for each x ∈ H, there exists Px defined to be a point of K closest to x. Show that Px is unique, so that P actually is a map. Hint: Suppose z_{1} and z_{2} both work as closest points. Consider the midpoint (z_{1} + z_{2})∕2 and use the parallelogram identity of Problem 16 in an auspicious manner.

- In the situation of Problem 17 suppose K is a closed convex subset and that H is complete. This means every Cauchy sequence converges. Recall from calculus that a sequence {k_{n}} is a Cauchy sequence if for every ε > 0 there exists N_{ε} such that whenever m,n > N_{ε}, it follows that ‖k_{m} − k_{n}‖ < ε. Let {k_{n}} be a sequence of points of K such that lim_{n→∞}‖x − k_{n}‖ = inf{‖x − k‖ : k ∈ K}. This is called a minimizing sequence. Show there exists a unique k ∈ K such that lim_{n→∞}k_{n} = k and that k = Px. That is, there exists a well defined projection map onto the convex subset of H. Hint: Use the parallelogram identity in an auspicious manner to show {k_{n}} is a Cauchy sequence which must therefore converge. Since K is closed, it follows this sequence converges to something in K which is the desired vector.

- Let H be an inner product space which is also complete and let P denote the projection map onto a convex closed subset, K. Show this projection map is characterized by the inequality Re⟨k − z, x − z⟩ ≤ 0 for all k ∈ K. That is, a point z ∈ K equals Px if and only if the above variational inequality holds. The inequality is called a variational inequality because k is allowed to vary while the inequality continues to hold for all k ∈ K.
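The characterization in Problem 19 can be checked on a concrete closed convex set. A sketch using the closed unit ball in ℝ², where the projection has the explicit form Px = x∕‖x‖ when ‖x‖ > 1; the sample points k are hypothetical:

```python
# Projection onto the closed unit ball in R^n, a closed convex set:
# Px = x when ||x|| <= 1, and Px = x/||x|| otherwise.
def proj_ball(x):
    n = sum(t * t for t in x) ** 0.5
    return list(x) if n <= 1.0 else [t / n for t in x]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

x = [3.0, 4.0]                       # a point outside the ball; Px = (0.6, 0.8)
px = proj_ball(x)
# variational inequality: <k - Px, x - Px> <= 0 for every k in the ball
for k in ([0.0, 0.0], [1.0, 0.0], [0.6, 0.8], [-0.8, 0.6]):
    gap = dot([a - b for a, b in zip(k, px)], [a - b for a, b in zip(x, px)])
    assert gap <= 1e-12              # holds for each sample point k
```

This only spot-checks the inequality at a few points of K; the exercise asks for the general argument.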

- Using Problem 19 and Problems 17 - 18, show the projection map P onto a closed convex subset is Lipschitz continuous with Lipschitz constant 1. That is, ‖Px − Py‖ ≤ ‖x − y‖.
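The Lipschitz bound can be observed on the same explicit example, projection onto the closed unit ball (the general proof is the point of the problem; this only spot-checks it on hypothetical pairs):

```python
# Spot-check that projection onto the closed unit ball is nonexpansive:
# ||Px - Py|| <= ||x - y||, i.e. Lipschitz constant 1.
def proj_ball(x):
    n = sum(t * t for t in x) ** 0.5
    return list(x) if n <= 1.0 else [t / n for t in x]

def dist(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

# hypothetical sample pairs, some inside and some outside the ball
pairs = [([3.0, 4.0], [0.0, 2.0]),
         ([0.2, 0.1], [5.0, -1.0]),
         ([2.0, 0.0], [0.0, 2.0])]
for x, y in pairs:
    assert dist(proj_ball(x), proj_ball(y)) <= dist(x, y) + 1e-12
```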
- Give an example of two vectors x,y in ℝ^{4} or ℝ^{3} and a subspace V such that x ⋅ y = 0 but Px ⋅ Py ≠ 0, where P denotes the projection map which sends a vector to its closest point on V.

- Suppose you are given the data points. Find the linear regression line using the formulas derived above. Then graph the given data along with your regression line.
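The regression-line formulas that come from the normal equations can be sketched as follows; the data points here are hypothetical stand-ins for the given data:

```python
# Linear regression y = m*x + b from the normal equations:
#   m = (n*Sxy - Sx*Sy) / (n*Sxx - Sx^2),   b = (Sy - m*Sx) / n.
def regression_line(pts):
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

# hypothetical data points lying exactly on y = 2x + 1
m, b = regression_line([(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)])
print(m, b)  # 2.0 1.0
```

Since the sample data lie exactly on a line, the formulas recover that line; for scattered data they give the best fit in the least squares sense.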
- Generalize the least squares procedure to the situation in which data is given and you desire to fit it with an expression of the form y = af(x) + bg(x) + c, where the problem would be to find a, b and c in order to minimize the error. Could this be generalized to higher dimensions? How about more functions?
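One way to set this up: build design-matrix rows (f(x), g(x), 1), form the 3 × 3 normal equations, and solve them. A sketch with the hypothetical choices f(x) = x² and g(x) = x, and hypothetical data:

```python
# Fit y = a*f(x) + b*g(x) + c by least squares: rows (f(x), g(x), 1) form the
# design matrix; solve the 3 x 3 normal equations by Gaussian elimination.
def fit_abc(pts, f, g):
    rows = [[f(x), g(x), 1.0] for x, _ in pts]
    ys = [y for _, y in pts]
    M = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    rhs = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    for c in range(3):                      # elimination with partial pivoting
        p = max(range(c, 3), key=lambda i: abs(M[i][c]))
        M[c], M[p] = M[p], M[c]
        rhs[c], rhs[p] = rhs[p], rhs[c]
        for i in range(c + 1, 3):
            t = M[i][c] / M[c][c]
            M[i] = [u - t * v for u, v in zip(M[i], M[c])]
            rhs[i] -= t * rhs[c]
    sol = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                     # back substitution
        sol[i] = (rhs[i] - sum(M[i][j] * sol[j] for j in range(i + 1, 3))) / M[i][i]
    return sol                              # [a, b, c]

# hypothetical data generated from y = 2*x**2 - x + 3
data = [(0.0, 3.0), (1.0, 4.0), (2.0, 9.0), (3.0, 18.0)]
a, b, c = fit_abc(data, lambda x: x * x, lambda x: x)
```

Adding more functions just adds columns to the design matrix, which answers the closing questions of the problem in principle.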
- Let A ∈ ℒ(X,Y) where X and Y are finite dimensional vector spaces with the dimension of X equal to n. Define rank(A) ≡ dim(AX) and nullity(A) ≡ dim(ker(A)). Show that nullity(A) + rank(A) = dim(X). Hint: Let {x_{i}}_{i=1}^{r} be a basis for ker(A) and let {x_{i}}_{i=1}^{r} ∪ {y_{i}}_{i=1}^{n−r} be a basis for X. Then show that {Ay_{i}}_{i=1}^{n−r} is linearly independent and spans AX.

- Let A be an m × n matrix. Show the column rank of A equals the column rank of A^{∗}A. Next verify the column rank of A^{∗}A is no larger than the column rank of A^{∗}. Next justify the following inequality to conclude the column rank of A equals the column rank of A^{∗}: rank(A) = rank(A^{∗}A) ≤ rank(A^{∗}) = rank(AA^{∗}) ≤ rank(A). Hint: Start with an orthonormal basis {Ax_{j}}_{j=1}^{r} of A(ℂ^{n}) and verify {A^{∗}Ax_{j}}_{j=1}^{r} is a basis for A^{∗}A(ℂ^{n}).

- Let A be a real m × n matrix and let A = QR be the QR factorization with Q orthogonal and R upper triangular. Show that there exists a solution x to the equation R^{T}Rx = R^{T}Q^{T}b and that this solution is also a least squares solution defined above, so that A^{T}Ax = A^{T}b.
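A numerical sketch of the last problem: compute a thin QR factorization, solve Rx = QᵀB by back substitution (possible here because R is invertible for this full-rank example, so x also satisfies RᵀRx = RᵀQᵀb), and check the normal equations AᵀAx = Aᵀb. The matrix A and vector b are hypothetical:

```python
# Least squares via thin QR: with A = QR where Q has orthonormal columns,
# solving R x = Q^T b also solves the normal equations A^T A x = A^T b.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def thin_qr(cols):
    n = len(cols)
    qs, r = [], [[0.0] * n for _ in range(n)]
    for k, a in enumerate(cols):
        w = list(a)
        for i, q in enumerate(qs):
            r[i][k] = dot(a, q)
            w = [x - r[i][k] * y for x, y in zip(w, q)]
        r[k][k] = dot(w, w) ** 0.5
        qs.append([t / r[k][k] for t in w])
    return qs, r

# hypothetical overdetermined system: A has columns a1, a2 in R^3
a1, a2, b = [1.0, 1.0, 1.0], [0.0, 1.0, 2.0], [1.0, 2.0, 4.0]
qs, r = thin_qr([a1, a2])
qb = [dot(q, b) for q in qs]
x1 = qb[1] / r[1][1]                     # back substitution on R x = Q^T b
x0 = (qb[0] - r[0][1] * x1) / r[0][0]
residual = [x0 * u + x1 * v - w for u, v, w in zip(a1, a2, b)]
# A^T (A x - b) = 0: the residual is orthogonal to both columns of A
```

The orthogonality of the residual to the columns of A is exactly the normal-equations condition the exercise asks you to derive.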
