The Lax Milgram theorem is a fundamental result which is useful for obtaining weak
solutions to many types of partial differential equations. It is really a general theorem in
functional analysis.
Definition 39.1.1 Let A ∈ ℒ(V,V^{′}) where V is a Hilbert space. Then A is said to be coercive if for some δ > 0,

A(x)(x) ≥ δ||x||_{V}^{2} for all x ∈ V.

Theorem 39.1.2 (Lax Milgram) Let V be a Hilbert space and suppose A ∈ ℒ(V,V^{′}) is coercive. Then A maps V one to one and onto V^{′}.

Proof: First note that if {Ax_{n}} is a Cauchy sequence in V^{′}, then by coercivity,

δ||x_{n} − x_{m}||_{V}^{2} ≤ A(x_{n} − x_{m})(x_{n} − x_{m}) ≤ ||Ax_{n} − Ax_{m}||_{V^{′}}||x_{n} − x_{m}||_{V},

and so {x_{n}} is a Cauchy sequence in V. It follows x_{n} → x ∈ V and since A is continuous, Ax_{n} → Ax. This shows A(V) is closed.
Now let R : V → V^{′} denote the Riesz map defined by Rx(y) = (y,x). Recall that the Riesz map is one to one, onto, and preserves norms. Therefore, R^{−1}(A(V)) is a closed subspace of V. If R^{−1}(A(V)) ≠ V, then (R^{−1}(A(V)))^{⊥} ≠ {0}. Let x ∈ (R^{−1}(A(V)))^{⊥} with x ≠ 0. Then in particular,
0 = (x, R^{−1}Ax) = R(R^{−1}(A(x)))(x) = A(x)(x) ≥ δ||x||_{V}^{2},
a contradiction to x ≠ 0. Therefore, R^{−1}(A(V)) = V and so A(V) = R(V) = V^{′}. This shows A is onto.
If Ax = Ay, then 0 = A(x − y)(x − y) ≥ δ||x − y||_{V}^{2}, and this shows A is one to one. This proves the theorem.
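The finite dimensional case already shows what coercivity buys. The following sketch (an illustration, not from the text) takes V = ℝ^{n}, identifies V^{′} with ℝ^{n}, and checks that a coercive but nonsymmetric matrix is invertible, together with the estimate ||x|| ≤ ||Ax||∕δ from the proof.

```python
import numpy as np

# Finite-dimensional Lax Milgram: if (Ax, x) >= delta |x|^2 for all x, then
# A is invertible and the proof's estimate gives |x| <= |Ax| / delta.
rng = np.random.default_rng(0)
n = 5
S = 2.0 * np.eye(n)                      # symmetric part, (Sx, x) = 2|x|^2
Q = rng.standard_normal((n, n))
Q = Q - Q.T                              # skew part, (Qx, x) = 0
A = S + Q                                # coercive with delta = 2, not symmetric

# delta is the smallest eigenvalue of the symmetric part of A
delta = np.linalg.eigvalsh((A + A.T) / 2).min()
assert delta > 0                         # coercivity holds

b = rng.standard_normal(n)
x = np.linalg.solve(A, b)                # unique solvability: A is one to one and onto
assert np.linalg.norm(x) <= np.linalg.norm(b) / delta + 1e-12
```

The skew part is what makes the example nontrivial: A is not symmetric, yet coercivity alone guarantees the solve succeeds.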
Here is a simple example which illustrates the use of the above theorem. In the
example the repeated index summation convention is being used. That is, you sum over
the repeated indices.
Example 39.1.3 Let U be an open subset of ℝ^{n} and let V be a closed subspace of H^{1}(U). Let α^{ij} ∈ L^{∞}(U) for i,j = 1,2,⋅⋅⋅,n. Now define A : V → V^{′} by

A(u)(v) ≡ ∫_{U} (α^{ij}(x)u_{,i}(x)v_{,j}(x) + u(x)v(x)) dx.

Suppose also that

α^{ij}v_{i}v_{j} ≥ δ|v|^{2}

whenever v ∈ ℝ^{n}. Then A maps V to V^{′} one to one and onto.
Here is why. It is obvious that A is in ℒ(V,V^{′}). It only remains to verify that it is coercive.

A(u)(u) ≡ ∫_{U} (α^{ij}(x)u_{,i}(x)u_{,j}(x) + u(x)u(x)) dx
≥ ∫_{U} (δ|∇u(x)|^{2} + |u(x)|^{2}) dx
≥ min(δ,1)||u||_{H^{1}(U)}^{2}.

This proves coercivity and verifies the claim.
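Note that the ellipticity condition α^{ij}v_{i}v_{j} ≥ δ|v|^{2} only involves the symmetric part of the matrix (α^{ij}), since the skew part contributes nothing to the quadratic form. A small numerical sanity check of this, with a hypothetical constant coefficient matrix:

```python
import numpy as np

# Ellipticity check: alpha^{ij} v_i v_j >= delta |v|^2 holds with delta equal
# to the smallest eigenvalue of the symmetric part of alpha.
alpha = np.array([[2.0, 1.0],
                  [-1.0, 3.0]])          # nonsymmetric but elliptic (made-up data)
delta = np.linalg.eigvalsh((alpha + alpha.T) / 2).min()

# spot-check the inequality on random directions v
rng = np.random.default_rng(1)
for _ in range(1000):
    v = rng.standard_normal(2)
    assert v @ alpha @ v >= delta * (v @ v) - 1e-10
```

Here the off-diagonal entries cancel in v^{T}αv, so the form reduces to 2v_{1}^{2} + 3v_{2}^{2} and δ = 2.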
What has been obtained in the above example? This depends on how you choose V. In Example 39.1.3 suppose U is a bounded open set with C^{0,1} boundary and V = H_{0}^{1}(U) where

H_{0}^{1}(U) ≡ {u ∈ H^{1}(U) : γu = 0}.

Also suppose f ∈ L^{2}(U). Then you can consider F ∈ V^{′} by defining

F(v) ≡ ∫_{U} f(x)v(x) dx.
According to the Lax Milgram theorem and the verification of its conditions in Example 39.1.3, there exists a unique solution to the problem of finding u ∈ H_{0}^{1}(U) such that for all v ∈ H_{0}^{1}(U),

∫_{U} (α^{ij}(x)u_{,i}(x)v_{,j}(x) + u(x)v(x)) dx = ∫_{U} f(x)v(x) dx.   (39.1.1)

In particular, this holds for all v ∈ C_{c}^{∞}(U). Thus for all such v,

∫_{U} (−(α^{ij}(x)u_{,i}(x))_{,j} + u(x) − f(x)) v(x) dx = 0.
Therefore, in terms of weak derivatives,

−(α^{ij}u_{,i})_{,j} + u = f

and since u ∈ H_{0}^{1}(U), it must be the case that γu = 0 on ∂U. This is why the solution to 39.1.1 is referred to as a weak solution to the boundary value problem

−(α^{ij}(x)u_{,i}(x))_{,j} + u(x) = f(x), u = 0 on ∂U.
Of course you then begin to ask the important question whether u really has two derivatives. It is not immediately clear that just because −(α^{ij}(x)u_{,i}(x))_{,j} ∈ L^{2}(U) it follows that the second derivatives of u exist. Actually this will often be true and is discussed somewhat in the next section.
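The Lax Milgram theorem applies verbatim on any closed subspace of H_{0}^{1}(U), in particular a finite dimensional one, which is the Galerkin method. A one dimensional sketch (α = 1, U = (0,1), piecewise linear hat functions; the discretization details are standard but not taken from the text) of the weak form of −u″ + u = f with u(0) = u(1) = 0:

```python
import numpy as np

# Galerkin sketch: find u in a hat-function subspace of H_0^1(0,1) with
#   integral(u'v' + uv) dx = integral(f v) dx  for all basis functions v.
# Manufactured exact solution u = sin(pi x), so f = (pi^2 + 1) sin(pi x).
n = 100                                  # interior nodes
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)

# stiffness (integral u'v') and mass (integral uv) matrices for P1 hats
K = (np.diag(2.0 * np.ones(n)) + np.diag(-np.ones(n - 1), 1)
     + np.diag(-np.ones(n - 1), -1)) / h
M = h * (np.diag(2 / 3 * np.ones(n)) + np.diag(1 / 6 * np.ones(n - 1), 1)
         + np.diag(1 / 6 * np.ones(n - 1), -1))

f = (np.pi ** 2 + 1) * np.sin(np.pi * x)
u = np.linalg.solve(K + M, M @ f)        # solvable by Lax Milgram: K + M is the
                                         # coercive form restricted to the subspace
err = np.max(np.abs(u - np.sin(np.pi * x)))   # O(h^2) discretization error
```

The matrix K + M is exactly A(φ_{i})(φ_{j}) for the hat basis, so its invertibility is the finite dimensional instance of the theorem.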
Next suppose you choose V = H^{1}(U) and let g ∈ H^{1∕2}(∂U). Define F ∈ V^{′} by

F(v) ≡ ∫_{U} f(x)v(x) dx + ∫_{∂U} g(x)γv(x) dμ.

Everything works the same way and you get the existence of a unique u ∈ H^{1}(U) such that for all v ∈ H^{1}(U),

∫_{U} (α^{ij}(x)u_{,i}(x)v_{,j}(x) + u(x)v(x)) dx = ∫_{U} f(x)v(x) dx + ∫_{∂U} g(x)γv(x) dμ   (39.1.2)
is satisfied. If you pretend u has all second order derivatives in L^{2}(U) and apply the divergence theorem, you find that you have obtained a weak solution to

−(α^{ij}u_{,i})_{,j} + u = f, α^{ij}u_{,i}n_{j} = g on ∂U

where n_{j} is the j^{th} component of n, the unit outer normal. Therefore, u is a weak solution to the above boundary value problem.
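The formal computation behind this reads as follows, for u smooth enough; the integration by parts is the divergence theorem applied to the vector field whose j^{th} component is α^{ij}u_{,i}v:

```latex
\int_U \alpha^{ij}u_{,i}\,v_{,j}\,dx
  \;=\; -\int_U \bigl(\alpha^{ij}u_{,i}\bigr)_{,j}\,v\,dx
        \;+\; \int_{\partial U} \alpha^{ij}u_{,i}\,n_j\,\gamma v\,d\mu .
```

Substituting this into 39.1.2 and collecting terms gives ∫_{U}(−(α^{ij}u_{,i})_{,j} + u − f)v dx + ∫_{∂U}(α^{ij}u_{,i}n_{j} − g)γv dμ = 0. Taking v ∈ C_{c}^{∞}(U) first forces the equation in U; with that term gone, the arbitrariness of γv forces α^{ij}u_{,i}n_{j} = g on ∂U.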
The conclusion is that the Lax Milgram theorem gives a way to obtain existence and uniqueness of weak solutions to various boundary value problems. The following theorem is often very useful in establishing coercivity. To prove this theorem, here is a definition.

Definition 39.1.4 Let U be an open set and δ > 0. Then

U_{δ} ≡ {x ∈ U : dist(x,U^{C}) > δ}.
Theorem 39.1.5 Let U be a connected bounded open set having C^{0,1} boundary such that for some sequence η_{k} ↓ 0,

U = ∪_{k=1}^{∞} U_{η_{k}}   (39.1.3)

and U_{η_{k}} is a connected open set. Suppose Γ ⊆ ∂U has positive surface measure and that

V ≡ {u ∈ H^{1}(U) : γu = 0 a.e. on Γ}.

Then the norm |||⋅||| given by

|||u||| ≡ (∫_{U} |∇u|^{2} dx)^{1∕2}

is equivalent to the usual norm on V.
Proof: First it is necessary to verify this is actually a norm. It clearly satisfies all the usual axioms of a norm except for the condition that |||u||| = 0 if and only if u = 0. Suppose then that |||u||| = 0. Let δ_{0} = η_{k} for one of those η_{k} mentioned above and define

u_{δ}(x) ≡ ∫_{B(0,δ)} u(x − y)ϕ_{δ}(y) dy

where ϕ_{δ} is a mollifier having support in B(0,δ). Then changing the variables, it follows that for x ∈ U_{δ_{0}},

u_{δ}(x) = ∫_{B(x,δ)} u(t)ϕ_{δ}(x − t) dt = ∫_{U} u(t)ϕ_{δ}(x − t) dt.

Therefore, u_{δ} equals a constant on U_{δ_{0}} because U_{δ_{0}} is a connected open set and u_{δ} is a smooth function defined on this set which has its gradient equal to 0. By Minkowski's inequality,

||u_{δ} − u||_{L^{2}(U_{δ_{0}})} ≤ ∫_{B(0,δ)} ϕ_{δ}(y)||u(⋅ − y) − u||_{L^{2}(U_{δ_{0}})} dy

and this converges to 0 as δ → 0 by continuity of translation in L^{2}. It follows there exists a sequence of constants c_{δ} ≡ u_{δ}(x) such that {c_{δ}} converges to u in L^{2}(U_{δ_{0}}).
Consequently, a subsequence, still denoted by u_{δ}, converges to u a.e. By Egoroff's theorem there exists a set N_{k} having measure no more than 3^{−k}m_{n}(U_{δ_{0}}) such that u_{δ} converges to u uniformly on N_{k}^{C}. Thus u is constant on N_{k}^{C}. Now ∑_{k}m_{n}(N_{k}) ≤ (1∕2)m_{n}(U_{δ_{0}}) and so there exists x_{0} ∈ U_{δ_{0}} ∖ ∪_{k=1}^{∞}N_{k}. Therefore, if x ∉ N_{k} for some k, it follows u(x) = u(x_{0}) and so, if u(x) ≠ u(x_{0}), it must be the case that x ∈ ∩_{k=1}^{∞}N_{k}, a set of measure zero. This shows that u equals a constant a.e. on U_{δ_{0}} = U_{η_{k}}. Since k is arbitrary, 39.1.3 shows u is a.e. equal to a constant on U. Therefore, u equals the restriction of a constant function to U and so γu equals this constant in L^{2}(∂U). Since the surface measure of Γ is positive, the constant must equal zero. Therefore, |||⋅||| is a norm.
It remains to verify that it is equivalent to the usual norm. It is clear that |||u||| ≤ ||u||_{1,2}. What about the other direction? Suppose it is not true that for some constant K, ||u||_{1,2} ≤ K|||u|||. Then for every k ∈ ℕ, there exists u_{k} ∈ V such that

||u_{k}||_{1,2} > k|||u_{k}|||.

Replacing u_{k} with u_{k}∕||u_{k}||_{1,2}, it can be assumed that ||u_{k}||_{1,2} = 1 for all k. Therefore, using the compactness of the embedding of H^{1}(U) into L^{2}(U), there exists a subsequence, still denoted by u_{k}, such that

u_{k} → u weakly in V,   (39.1.4)
u_{k} → u strongly in L^{2}(U),   (39.1.5)
|||u_{k}||| → 0,   (39.1.6)
u_{k} → u weakly in (V,|||⋅|||).   (39.1.7)
From 39.1.6 and 39.1.7, it follows u = 0. Therefore, ||u_{k}||_{L^{2}(U)} → 0. This with 39.1.6 contradicts the fact that ||u_{k}||_{1,2} = 1 and this proves the equivalence of the two norms.
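In concrete cases the equivalence constant can be estimated. As a hypothetical one dimensional stand-in for the theorem, take U = (0,1) and Γ = {0}, so that V = {u ∈ H^{1}(0,1) : u(0) = 0}; then ||u||_{L^{2}}^{2} ≤ |||u|||^{2}∕λ_{min}, where λ_{min} is the first eigenvalue of −u″ = λu, u(0) = 0, u′(1) = 0, namely (π∕2)^{2}. The piecewise linear scheme below (standard, not from the text) recovers this constant numerically.

```python
import numpy as np

# Smallest eigenvalue of -u'' = lam * u, u(0) = 0, u'(1) = 0, via P1 finite
# elements: generalized eigenproblem K c = lam * M c.  Exact value is (pi/2)^2.
n = 200
h = 1.0 / n
K = (np.diag(2.0 * np.ones(n)) + np.diag(-np.ones(n - 1), 1)
     + np.diag(-np.ones(n - 1), -1)) / h
K[-1, -1] = 1.0 / h                      # half hat at the free end x = 1
M = h * (np.diag(2 / 3 * np.ones(n)) + np.diag(1 / 6 * np.ones(n - 1), 1)
         + np.diag(1 / 6 * np.ones(n - 1), -1))
M[-1, -1] = h / 3
lam = np.linalg.eigvals(np.linalg.solve(M, K))
lam_min = lam.real.min()                 # approximates (pi/2)^2 ~ 2.467
```

With this value, ||u||_{1,2}^{2} = ||u||_{L^{2}}^{2} + |||u|||^{2} ≤ (1 + 1∕λ_{min})|||u|||^{2}, giving an explicit K for this model case.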
The proof of the above theorem yields the following interesting corollary.
Corollary 39.1.6 Let U be a connected open set with the property that for some sequence η_{k} ↓ 0,

U = ∪_{k=1}^{∞} U_{η_{k}}

for U_{η_{k}} a connected open set, and suppose u ∈ W^{1,p}(U) and ∇u = 0 a.e. Then u equals a constant a.e.
Example 39.1.7 Let U be a bounded open connected subset of ℝ^{n} and let V be a closed subspace of H^{1}(U) defined by

V ≡ {u ∈ H^{1}(U) : γu = 0 on Γ}

where the surface measure of Γ is positive. Let α^{ij} ∈ L^{∞}(U) for i,j = 1,2,⋅⋅⋅,n and define A : V → V^{′} by

A(u)(v) ≡ ∫_{U} α^{ij}(x)u_{,i}(x)v_{,j}(x) dx

where

α^{ij}v_{i}v_{j} ≥ δ|v|^{2}

whenever v ∈ ℝ^{n}. Then A maps V to V^{′} one to one and onto.
This follows from Theorem 39.1.5 using the equivalent norm defined there. Define F ∈ V^{′} by

F(v) ≡ ∫_{U} f(x)v(x) dx + ∫_{∂U∖Γ} g(x)γv(x) dμ

for f ∈ L^{2}(U) and g ∈ H^{1∕2}(∂U). Then the solution of the equation

Au = F in V^{′},

which is equivalent to finding u ∈ V such that for all v ∈ V,

∫_{U} α^{ij}(x)u_{,i}(x)v_{,j}(x) dx = ∫_{U} f(x)v(x) dx + ∫_{∂U∖Γ} g(x)γv(x) dμ,

is a weak solution for the boundary value problem

−(α^{ij}u_{,i})_{,j} = f in U, α^{ij}u_{,i}n_{j} = g on ∂U ∖ Γ, u = 0 on Γ

as you can verify by using the divergence theorem formally.