where the $a_j$ are the coefficients in the characteristic polynomial for $A$ and $a_k = 0$ for $k > m$, $a_m \neq 0$. The constant term of this polynomial in $\lambda$ must be nonzero for all $\varepsilon$ small enough because it is of the form
$$(-1)^m \varepsilon^m a_m + (\text{higher order terms in } \varepsilon) = \varepsilon^m \left[ a_m (-1)^m + \varepsilon C(\varepsilon) \right]$$
which is nonzero for all positive but very small $\varepsilon$. Thus $\varepsilon I + A$ is invertible for all $\varepsilon$ small enough but nonzero. $\blacksquare$
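For a quick numerical illustration of this argument, here is a minimal sketch assuming NumPy; the singular matrix $A$ below is an arbitrary example, not one from the text.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])            # rank 1, so det(A) = 0
print(np.linalg.det(A))               # ~0: A itself is not invertible

for eps in [1e-1, 1e-3, 1e-6]:
    # for this A, det(eps*I + A) = eps^2 + 5*eps, nonzero for small eps != 0
    print(eps, np.linalg.det(eps * np.eye(2) + A))
```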
Recall that for $A$ a $p \times p$ matrix, $\operatorname{cof}(A)_{ij}$ is the determinant of the matrix which results from deleting the $i^{th}$ row and the $j^{th}$ column, multiplied by $(-1)^{i+j}$. In the proof and in what follows, I am using $Dg$ to equal the matrix of the linear transformation $Dg$ taken with respect to the usual basis on $\mathbb{R}^p$. Thus $(Dg)_{ij} = \partial g_i / \partial x_j$ where $g = \sum_i g_i e_i$ for $e_i$ the standard basis vectors.
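A direct rendering of this definition, as a sketch in NumPy (the matrix $A$ below is an arbitrary invertible example):

```python
import numpy as np

def cof(A, i, j):
    # delete row i and column j, take the determinant, attach the sign (-1)^{i+j}
    minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return (-1)**(i + j) * np.linalg.det(minor)

A = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 4.0]])
C = np.array([[cof(A, i, j) for j in range(3)] for i in range(3)])
# for invertible A, the cofactor matrix equals det(A) * inv(A)^T entrywise
print(np.allclose(C, np.linalg.det(A) * np.linalg.inv(A).T))   # True
```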
Lemma 4.7.2 Let $g : U \to \mathbb{R}^p$ be $C^2$ where $U$ is an open subset of $\mathbb{R}^p$. Then
$$\sum_{j=1}^{p} \operatorname{cof}(Dg)_{ij,j} = 0,$$
where here $(Dg)_{ij} \equiv g_{i,j} \equiv \frac{\partial g_i}{\partial x_j}$. Also, $\operatorname{cof}(Dg)_{ij} = \frac{\partial \det(Dg)}{\partial g_{i,j}}$.
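Before turning to the proof, the first claim can be sanity-checked by finite differences. The sketch below assumes NumPy; the map $g$, the base point, and the step size are arbitrary choices for illustration only.

```python
import numpy as np

def g(x):
    # an arbitrary C^2 map from R^3 to R^3, chosen only for illustration
    return np.array([x[0]**2 * x[1], np.sin(x[1]) + x[2]**3, x[0] * x[2]])

def Dg(x):
    # its Jacobian, (Dg)_{ij} = dg_i/dx_j, differentiated by hand
    return np.array([[2*x[0]*x[1], x[0]**2,      0.0      ],
                     [0.0,         np.cos(x[1]), 3*x[2]**2],
                     [x[2],        0.0,          x[0]     ]])

def cof(M):
    # cofactor matrix: cof(M)_{ij} = (-1)^{i+j} det(M with row i, col j deleted)
    C = np.zeros_like(M)
    for i in range(3):
        for j in range(3):
            minor = np.delete(np.delete(M, i, axis=0), j, axis=1)
            C[i, j] = (-1)**(i + j) * np.linalg.det(minor)
    return C

x0, h = np.array([0.3, -0.7, 1.1]), 1e-5
row_div = np.zeros(3)    # entry i approximates sum_j d/dx_j [cof(Dg)_{ij}]
for j in range(3):
    e = np.zeros(3); e[j] = h
    row_div += (cof(Dg(x0 + e)) - cof(Dg(x0 - e)))[:, j] / (2 * h)
print(row_div)           # all three entries ~0, up to O(h^2) error
```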
Proof: From the cofactor expansion theorem,
$$\delta_{kj} \det(Dg) = \sum_{i=1}^{p} g_{i,k} \operatorname{cof}(Dg)_{ij} \tag{4.9}$$
This is because if $k \neq j$, the expression on the right is the cofactor expansion of a determinant with two equal columns, while if $k = j$, it is just the cofactor expansion of the determinant of $Dg$.
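Both 4.9 and the derivative formula obtained from it next can be checked numerically. In this sketch, assuming NumPy, a random invertible matrix stands in for $Dg$ at a point, and its cofactor matrix is computed from the identity $\operatorname{cof}(M) = \det(M)\,(M^{-1})^T$.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))                 # stands in for Dg at a point
C = np.linalg.det(M) * np.linalg.inv(M).T       # cofactor matrix of invertible M

# 4.9: (M^T C)_{kj} = sum_i M[i,k] * cof(M)[i,j] = delta_{kj} * det(M)
print(np.allclose(M.T @ C, np.linalg.det(M) * np.eye(4)))   # True

# the derivative identity 4.10, checked by central differences; since det is
# linear in each single entry, the difference quotient is exact up to roundoff
i, j, h = 1, 2, 1e-6
E = np.zeros((4, 4)); E[i, j] = h
ddet = (np.linalg.det(M + E) - np.linalg.det(M - E)) / (2 * h)
print(ddet, C[i, j])                            # the two numbers agree
```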
In particular,
$$\frac{\partial \det(Dg)}{\partial g_{i,j}} = \operatorname{cof}(Dg)_{ij} \tag{4.10}$$
which shows the last claim of the lemma. Assume that $Dg(x)$ is invertible to begin with. Differentiate 4.9 with respect to $x_j$ and sum on $j$ using the chain rule in Proposition 4.4.1. Note $\det Dg$ is a function of the $g_{r,s}$, which are functions of the $x_k$. This yields
$$\sum_{r,s,j} \delta_{kj} \frac{\partial (\det Dg)}{\partial g_{r,s}}\, g_{r,sj} = \sum_{i,j} g_{i,kj} \operatorname{cof}(Dg)_{ij} + \sum_{i,j} g_{i,k} \operatorname{cof}(Dg)_{ij,j}.$$
Hence, using $\delta_{kj} = 0$ if $j \neq k$ and 4.10,
$$\sum_{r,s} \operatorname{cof}(Dg)_{rs}\, g_{r,sk} = \sum_{r,s} g_{r,ks} \operatorname{cof}(Dg)_{rs} + \sum_{i,j} g_{i,k} \operatorname{cof}(Dg)_{ij,j}.$$
Subtracting the first sum on the right from both sides and using the equality of mixed partials,
$$\sum_{i} g_{i,k} \Bigl( \sum_{j} \operatorname{cof}(Dg)_{ij,j} \Bigr) = 0.$$
This holds for each $k$, and it says that the vector whose $i^{th}$ entry is $\sum_j \operatorname{cof}(Dg)_{ij,j}$ is annihilated by the transpose of $Dg$. Since it is assumed $Dg$ is invertible, this shows $\sum_j \operatorname{cof}(Dg)_{ij,j} = 0$. If $\det(Dg) = 0$, use Lemma 4.7.1 to let
$$g_k(x) = g(x) + \varepsilon_k x$$