12.1 The Derivative Of Functions Of One Variable, o(v)
First consider the notion of the derivative of a function of one variable.
Observation 12.1.1 Suppose a function $f$ of one variable has a derivative at $x$. Then
\[
\lim_{h\to 0} \frac{|f(x+h) - f(x) - f'(x)h|}{|h|} = 0.
\]
This observation follows from the definition of the derivative of a function of one variable, namely
\[
f'(x) \equiv \lim_{h\to 0} \frac{f(x+h) - f(x)}{h}.
\]
Thus
\[
\lim_{h\to 0} \frac{|f(x+h) - f(x) - f'(x)h|}{|h|}
= \lim_{h\to 0} \left| \frac{f(x+h) - f(x)}{h} - f'(x) \right| = 0.
\]
Definition 12.1.2 A vector valued function of a vector $v$ is called $o(v)$ (referred to as "little o of $v$") if
\[
\lim_{|v|\to 0} \frac{o(v)}{|v|} = 0. \tag{12.1}
\]
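To get a feel for this definition, here is a quick numerical sketch in Python (the helper name `ratio` is illustrative, not from the text): for a function of one variable, being $o(h)$ means the ratio against $|h|$ shrinks to $0$.

```python
# Numerically illustrate Definition 12.1.2 for functions of one variable:
# g(h) = h**2 is o(h), since |g(h)|/|h| = |h| -> 0 as h -> 0,
# while g(h) = h is not o(h), since |h|/|h| = 1 for every h != 0.

def ratio(g, h):
    """Return |g(h)| / |h|, the quantity that must tend to 0."""
    return abs(g(h)) / abs(h)

for h in [0.1, 0.01, 0.001]:
    print(ratio(lambda t: t**2, h), ratio(lambda t: t, h))

# The first column shrinks like |h|; the second stays stuck at 1.
```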
Thus for a function of one variable, the function $f(x+h) - f(x) - f'(x)h$ is $o(h)$. When we say a
function is $o(h)$, it is used like an adjective. It is like saying the function is white or black or green or fat or
thin. The term is used very imprecisely. Thus in general,
\[
o(v) + o(v) = o(v), \qquad 45\, o(v) = o(v), \qquad o(v) - o(v) = o(v).
\]
When you add two functions with the property of the above definition, you get another one having
that same property. When you multiply by 45, the property is also retained, as it is when you
subtract two such functions. How could something so sloppy be useful? The notation is useful
precisely because it prevents you from obsessing over things which are not relevant and should be
ignored.
Theorem 12.1.3 Let $f : (a,b) \to \mathbb{R}$ be a function of one variable. Then $f'(x)$ exists if and only if there exists $p$ such that
\[
f(x+h) - f(x) = ph + o(h). \tag{12.2}
\]
In this case, $p = f'(x)$.
Proof: From the above observation it follows that if $f'(x)$ exists, then $f(x+h) - f(x) - f'(x)h = o(h)$, so (12.2) holds with $p = f'(x)$. Conversely, if (12.2) holds, then dividing by $h$,
\[
\frac{f(x+h) - f(x)}{h} = p + \frac{o(h)}{h} \to p \text{ as } h \to 0,
\]
and in fact this limit exists, which shows that $p = f'(x)$. ■
This theorem shows that one way to define $f'(x)$ is as the number $p$, if there is one, which has the
property that
\[
f(x+h) = f(x) + ph + o(h).
\]
You should think of $p$ as the linear transformation resulting from multiplication by the $1\times 1$ matrix $(p)$.
Example 12.1.4 Let $f(x) = x^3$. Find $f'(x)$.
\[
f(x+h) = (x+h)^3 = x^3 + 3x^2h + 3xh^2 + h^3 = f(x) + 3x^2h + (3xh + h^2)h.
\]
Since $(3xh + h^2)h = o(h)$, it follows that $f'(x) = 3x^2$.
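One can confirm this numerically with a short Python sketch (the helper name `remainder_ratio` is illustrative): the remainder $f(x+h) - f(x) - 3x^2h$, divided by $h$, equals $3xh + h^2$ and so tends to $0$.

```python
# Check Example 12.1.4 numerically: for f(x) = x**3 the remainder
# f(x+h) - f(x) - 3*x**2*h equals (3*x*h + h**2)*h, which is o(h).

def remainder_ratio(x, h):
    f = lambda t: t**3
    return (f(x + h) - f(x) - 3 * x**2 * h) / h   # should tend to 0

x = 2.0
for h in [1e-1, 1e-2, 1e-3]:
    print(remainder_ratio(x, h))

# The ratio equals 3*x*h + h**2 exactly, so it shrinks linearly with h.
```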
Example 12.1.5 Let $f(x) = \sin(x)$. Find $f'(x)$.
\begin{align*}
f(x+h) - f(x) &= \sin(x+h) - \sin(x) = \sin(x)\cos(h) + \cos(x)\sin(h) - \sin(x)\\
&= \cos(x)\sin(h) + \sin(x)\frac{\cos(h)-1}{h}h\\
&= \cos(x)h + \cos(x)\frac{\sin(h)-h}{h}h + \sin(x)\frac{\cos(h)-1}{h}h.
\end{align*}
Now
\[
\cos(x)\frac{\sin(h)-h}{h}h + \sin(x)\frac{\cos(h)-1}{h}h = o(h). \tag{12.3}
\]
Remember the fundamental limits which allowed you to find the derivative of $\sin(x)$ were
\[
\lim_{h\to 0}\frac{\sin(h)}{h} = 1, \qquad \lim_{h\to 0}\frac{\cos(h)-1}{h} = 0. \tag{12.4}
\]
These same limits are what is needed to verify (12.3).
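The following Python sketch tabulates these two limits and then checks that the remainder $\sin(x+h) - \sin(x) - \cos(x)h$, divided by $h$, is small (the variable names here are illustrative).

```python
import math

# The fundamental limits (12.4): sin(h)/h -> 1 and (cos(h)-1)/h -> 0.
for h in [1e-1, 1e-2, 1e-3]:
    print(math.sin(h) / h, (math.cos(h) - 1) / h)

# Consequently sin(x+h) - sin(x) - cos(x)*h, divided by h, tends to 0,
# verifying (12.3) at a sample point x.
x, h = 0.7, 1e-4
r = (math.sin(x + h) - math.sin(x) - math.cos(x) * h) / h
print(r)   # small: roughly -sin(x)*h/2
```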
How can you tell whether a function of two variables $(u,v)$ is $o((u,v))$? In general, there is no
substitute for the definition, but you can often identify this property by observing that the expression
involves only "higher order terms". These are terms like $u^2v, uv, v^4$, etc. If you sum the exponents on the $u$
and the $v$ you get something larger than 1. For example,
\[
\left|\frac{uv}{\sqrt{u^2+v^2}}\right| \le \frac{1}{2}\left(u^2+v^2\right)\frac{1}{\sqrt{u^2+v^2}} = \frac{1}{2}\sqrt{u^2+v^2}
\]
and this converges to 0 as $(u,v)\to(0,0)$. This follows from the inequality $|uv| \le \frac{1}{2}\left(u^2+v^2\right)$, which
you can verify from $(u-v)^2 \ge 0$. Similar considerations apply in higher dimensions also. In
general, this is a hard question because it involves a limit of a function of many variables.
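As a numerical illustration of the example above (a Python sketch; the helper name `ratio` is not from the text), the quotient $|uv|/\sqrt{u^2+v^2}$ shrinks as $(u,v)$ approaches the origin, while a first-order term such as $u$ alone fails the test.

```python
import math

# g(u, v) = u*v is o((u, v)): the ratio |u*v| / sqrt(u**2 + v**2)
# tends to 0 as (u, v) -> (0, 0), and by the inequality above it is
# bounded by (1/2) * sqrt(u**2 + v**2).

def ratio(u, v):
    return abs(u * v) / math.hypot(u, v)

for t in [1e-1, 1e-2, 1e-3]:
    u, v = t, 2 * t                      # approach the origin along a line
    r, bound = ratio(u, v), 0.5 * math.hypot(u, v)
    print(r, bound)
    assert r <= bound + 1e-15

    # By contrast g(u, v) = u (exponent sum 1) is NOT o((u, v)):
    # |u| / hypot(u, v) stays at 1/sqrt(5) along this line forever.
    assert abs(abs(u) / math.hypot(u, v) - 1 / math.sqrt(5)) < 1e-12
```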
Furthermore, there is really no substitute for answering this question, because its resolution
involves the definition of whether a function is differentiable. That may be why we spend most of
our time on one dimensional considerations which involve taking the partial derivatives. The
following exercises should help give you an idea of how to determine whether something is
o.