- For (x, y) ≠ (0, 0), let f be the given function of two variables. Show that f(x, y) has a limit as (x, y) → (0, 0) on an arbitrary straight line through (0, 0). Next show that this function fails to have a limit at (0, 0).
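A classic function exhibiting this behavior (illustrative only; not necessarily the one intended in the problem) is f(x, y) = x^{2}y/(x^{4} + y^{2}): along every straight line through the origin the values tend to 0, yet along the parabola y = x^{2} the function is identically 1/2, so no limit exists at (0, 0). A quick numerical check:

```python
def f(x, y):
    # Illustrative example: limit is 0 along every line through the origin,
    # but the value is 1/2 along the parabola y = x^2.
    return x**2 * y / (x**4 + y**2) if (x, y) != (0.0, 0.0) else 0.0

t = 1e-4
# Along the line y = m*x, f(t, m*t) = m*t/(t^2 + m^2) -> 0 as t -> 0.
for m in [0.5, 1.0, 2.0]:
    assert abs(f(t, m * t)) < 1e-3
# Along y = x^2 the value is exactly 1/2, so f has no limit at (0, 0).
assert abs(f(t, t**2) - 0.5) < 1e-12
```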
- Here are some scalar valued functions of several variables. Determine which of these functions are o(v). Here v is a vector in ℝ^{n}, v = (v_{1}, …, v_{n}).
  - v_{1}v_{2}
  - v_{2}sin(…)
  - v_{1}^{2} + v_{2}
  - v_{2}sin(…)
  - v_{1}(…)
- Here is a function of two variables: f(x, y) = x^{2}y + x^{2}. Find Df(x, y) directly from the definition. Recall this should be the linear transformation which results from multiplication by a 1 × 2 matrix. Find this matrix.
- Let f be the given function from ℝ^{2} to ℝ^{2}. Compute the derivative directly from the definition. This should be the linear transformation which results from multiplying by a 2 × 2 matrix. Find this matrix.
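For the first of these, the matrix computed by hand can be sanity-checked numerically: for f(x, y) = x^{2}y + x^{2} the 1 × 2 matrix of Df(x, y) should be [2xy + 2x, x^{2}], and finite differences ought to agree with it. A minimal sketch:

```python
def f(x, y):
    return x**2 * y + x**2

def jacobian(x, y):
    # Computed by hand from the definition: Df(x, y) = [2xy + 2x, x^2].
    return [2*x*y + 2*x, x**2]

h = 1e-6
x, y = 1.0, 2.0
# One-sided finite-difference approximations to the two partial derivatives.
fd = [(f(x + h, y) - f(x, y)) / h, (f(x, y + h) - f(x, y)) / h]
assert all(abs(a - b) < 1e-4 for a, b in zip(fd, jacobian(x, y)))
```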
- You have h(x) = g(f(x)). Here x ∈ ℝ^{n}, f(x) ∈ ℝ^{m} and g(f(x)) ∈ ℝ^{p}, where f, g are appropriately differentiable. Thus Dh(x) results from multiplication by a matrix. Using the chain rule, give a formula for the ij^{th} entry of this matrix. How does this relate to multiplication of matrices? In other words, you have two matrices which correspond to Dg(f(x)) and Df(x). Call z = g(y), y = f(x). Then

  ∂z_{i}/∂x_{j} = ∑_{k=1}^{m}(∂z_{i}/∂y_{k})(∂y_{k}/∂x_{j})

  Explain the manner in which the ij^{th} entry of Dh(x) is this sum. This is a review of the way we multiply matrices. What is the i^{th} row of Dg(y) and the j^{th} column of Df(x)?
- Find f_{x}, f_{y}, f_{z}, f_{xy}, f_{yx}, f_{zy} for the following. Verify the mixed partial derivatives are equal.
  - x^{2}y^{3}z^{4} + sin(…)
  - sin(…) + x^{2}yz
  - x…
- Suppose f is a continuous function and f : U → ℝ where U is an open set and suppose that x ∈ U has the property that for all y near x, f(y) ≤ f(x). Prove that if f has all of its partial derivatives at x, then f_{x_{i}}(x) = 0 for each x_{i}. Hint: Consider h(t) = f(x + tv). Argue that h^{′}(0) = 0 and then see what this implies about Df(x).
- As an important application of Problem 7 consider the following. Experiments are done at n times, t_{1}, t_{2}, …, t_{n} and at each time there results a collection of numerical outcomes. Denote by {(t_{i}, x_{i})}_{i=1}^{p} the set of all such pairs and try to find numbers a and b such that the line x = at + b approximates these ordered pairs as well as possible in the sense that out of all choices of a and b, ∑_{i=1}^{p}(x_{i} − (at_{i} + b))^{2} is as small as possible. In other words, you want to minimize the function of two variables f(a, b) ≡ ∑_{i=1}^{p}(x_{i} − (at_{i} + b))^{2}. Find a formula for a and b in terms of the given ordered pairs. You will be finding the formula for the least squares regression line.
- Let f be a function which has continuous derivatives. Show that u(t, x) = f(x − ct) solves the wave equation u_{tt} − c^{2}Δu = 0. What about u(t, x) = f(x + ct)? Here Δu = u_{xx}.
- Show that if Δu = λu where u is a function of only x, then e^{λt}u solves the heat equation u_{t} − Δu = 0. Here Δu = u_{xx}.
- Show that if f(v) = o(v), then f^{′}(0) = 0.
- Let f be defined on ℝ^{2} as follows. f(x, x^{2}) = 1 if x ≠ 0. Define f(0, 0) = 0, and f(x, y) = 0 if y ≠ x^{2}. Show that f is not continuous at (0, 0) but that lim_{t→0}(f(tv) − f(0, 0))/t = 0 for an arbitrary vector v. This is called a Gateaux derivative. Thus the Gateaux derivative exists at (0, 0) in every direction but f is not even continuous there.
- Let f be the given function. Show that this function is not continuous at (0, 0) but that the Gateaux derivative lim_{t→0}(f(tv) − f(0, 0))/t exists and equals 0 for every vector v.
- Suppose f : ℝ^{n} → ℝ^{n} is one to one and continuous. Suppose also that lim_{∥x∥ →∞}∥f(x)∥ = ∞. Show that f must also be onto. Hint: By invariance of domain, f(ℝ^{n}) is open. Show that ℝ^{n}∖f(ℝ^{n}) is also open. Since ℝ^{n} is connected (by theorems on connected sets), one of these open sets is empty.
- One of the big applications of the implicit function theorem is to the method of Lagrange
multipliers. The heuristic explanations usually given in beginning calculus courses are
specious. At least this is certainly true of the explanation I use all the time, based on
pictures and geometric reasoning. These arguments break down as soon as you ask the obvious
question of whether there is a smooth curve through a point in the level surface. In other words,
why does the level surface even look the way we draw it in these courses? To do the
method of Lagrange multipliers correctly, you need to use some sort of big theorem and the
version involving the implicit function theorem is likely the easiest. Using the implicit
function theorem, prove the following theorem which is the general method of Lagrange
multipliers.
Theorem 3.8.1 Let U be an open subset of ℝ^{n} and let f : U → ℝ be a C^{1} function. Suppose x_{0} ∈ U has the property that g_{i}(x_{0}) = 0, i = 1, …, m, each g_{i} a C^{1} function, and x_{0} is either a local maximum or local minimum of f on the intersection of the level sets just described. If some m × m submatrix of the matrix of partial derivatives (∂g_{i}/∂x_{j}) has nonzero determinant, then there exist scalars λ_{1}, …, λ_{m} such that

∇f(x_{0}) = λ_{1}∇g_{1}(x_{0}) + ⋯ + λ_{m}∇g_{m}(x_{0})  (3.20)

Hint: Let F : U × ℝ → ℝ^{m+1} be defined by

F(x, a) ≡ (f(x) − a, g_{1}(x), …, g_{m}(x))  (3.21)

and if the condition holds on rank, and 3.20 fails to hold, you can use the implicit function theorem to solve for m + 1 of the variables in terms of the others, a being one of them, these other variables being in an open set. In particular x_{0} cannot be a local extremum unless 3.20 holds.
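To see the conclusion of the theorem concretely, here is a minimal numerical sketch with an assumed example (not from the text): maximize f(x, y) = x + y subject to the single constraint g(x, y) = x^{2} + y^{2} − 1 = 0. The maximizer is (1/√2, 1/√2), and one scalar λ makes ∇f = λ∇g hold in every component, as 3.20 asserts.

```python
import math

# Assumed example: f(x, y) = x + y on the constraint g(x, y) = x^2 + y^2 - 1 = 0.
def grad_f(x, y):
    return (1.0, 1.0)

def grad_g(x, y):
    return (2*x, 2*y)

# Candidate maximizer, found analytically.
x0 = y0 = 1 / math.sqrt(2)
gf, gg = grad_f(x0, y0), grad_g(x0, y0)
lam = gf[0] / gg[0]  # the lambda that matches the first components
# 3.20 requires grad f = lambda * grad g in every component.
assert abs(gf[1] - lam * gg[1]) < 1e-12
# Brute-force check that (x0, y0) really maximizes f on the circle.
best = max(math.cos(t) + math.sin(t)
           for t in [2 * math.pi * k / 10000 for k in range(10000)])
assert best <= x0 + y0 + 1e-6
```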

- Suppose you have a set S, which we usually refer to as a level surface in ℝ^{n+1}, and we give examples of things like ellipsoids and spheres. Then everyone is deceived into thinking they know what is going on because of the examples. After this deception, we give specious arguments to justify the method of Lagrange multipliers (I have spent my career giving such specious arguments) by showing, using the chain rule, that the gradient of the objective function is perpendicular to the direction vector of every smooth curve lying in S at a point where the maximum or minimum exists. One thing which is missing in this kind of stupidity is a consideration of whether there even exist such smooth curves. Questions about whether something exists are not currently in vogue. Superficial arguments and faith have supplanted such questions. Use the implicit function theorem to give conditions which imply the existence of such smooth curves at a point on S.
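The mechanism requested here can be illustrated numerically on an assumed example surface (a sphere, standing in for the general S): where g_{z} ≠ 0, the implicit function theorem gives z = φ(x, y) locally, so t ↦ (x(t), y(t), φ(x(t), y(t))) is a smooth curve lying in S, and the chain rule then forces ∇g to be perpendicular to its tangent vector.

```python
import math

# Assumed example surface S = {g = 0} with g(x, y, z) = x^2 + y^2 + z^2 - 1.
def g(x, y, z):
    return x*x + y*y + z*z - 1

# Where g_z != 0, the implicit function theorem gives z = phi(x, y) locally;
# gamma(t) = (t, 0, phi(t, 0)) is then a smooth curve lying in S.
def gamma(t):
    return (t, 0.0, math.sqrt(1 - t*t))

def gamma_prime(t):
    return (1.0, 0.0, -t / math.sqrt(1 - t*t))

t0 = 0.5
# The curve really lies in S.
assert abs(g(*gamma(t0))) < 1e-12
# Chain rule: grad g . gamma'(t) = 0, i.e. grad g is perpendicular to the curve.
x, y, z = gamma(t0)
grad_g = (2*x, 2*y, 2*z)
dot = sum(a * b for a, b in zip(grad_g, gamma_prime(t0)))
assert abs(dot) < 1e-9
```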
