
We describe some of the basic tools of convex nondifferentiable analysis: subgradients, directional derivatives, and supporting hyperplanes, emphasizing their geometric interpretations. We show how to compute supporting hyperplanes and subgradients for the various specifications and functionals described in previous chapters.
Many of the specifications and functionals that we have encountered in chapters 8–10 are not smooth: the specifications can have "sharp corners" and the functionals need not be differentiable. Fortunately, for convex sets and functionals, some of the most important analytical tools do not depend on smoothness. In this chapter we study these tools. Perhaps more importantly, there are simple and effective algorithms for convex optimization that do not require smooth constraints or differentiable objectives. We will study some of these algorithms in the next chapter.
13.1 Subgradients
If $\phi : \mathbf{R}^n \rightarrow \mathbf{R}$ is convex and differentiable, we have
$$
\phi(z) \geq \phi(x) + \nabla \phi(x)^T (z - x) \quad \mbox{for all } x,\ z. \eqno(13.1)
$$
This means that the plane tangent to the graph of $\phi$ at $x$ always lies below the graph of $\phi$.
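For example, if $\phi(x) = x^T x$, then $\nabla\phi(x) = 2x$ and
$$
\phi(z) - \phi(x) - \nabla\phi(x)^T(z - x) = z^T z - x^T x - 2x^T(z - x) = \|z - x\|^2 \geq 0,
$$
so (13.1) holds, with equality exactly when $z = x$.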
If $\phi : \mathbf{R}^n \rightarrow \mathbf{R}$ is convex, but not necessarily differentiable, we will say that $g \in \mathbf{R}^n$ is a subgradient of $\phi$ at $x$ if
$$
\phi(z) \geq \phi(x) + g^T (z - x) \quad \mbox{for all } z. \eqno(13.2)
$$
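As a concrete illustration, consider the convex, nondifferentiable function $\phi(x) = |x|$ on $\mathbf{R}$: at $x = 0$, every $g$ with $|g| \leq 1$ satisfies (13.2), since $|z| \geq gz$ for all $z$, so each such $g$ is a subgradient of $\phi$ at $0$. The following minimal numerical sketch (using Python/numpy, which is not part of the text; the grid and the candidate values of $g$ are arbitrary) spot-checks this, and shows that $g = 1.5$ fails.

```python
import numpy as np

# Numerical spot-check of the subgradient inequality (13.2) for phi(x) = |x|
# at x = 0: phi(z) >= phi(0) + g*(z - 0) should hold for all z iff |g| <= 1.
phi = abs
x = 0.0
z_grid = np.linspace(-5.0, 5.0, 1001)

for g in (-1.0, -0.3, 0.0, 0.7, 1.0, 1.5):   # only |g| <= 1 are subgradients at 0
    holds = bool(np.all(np.abs(z_grid) >= phi(x) + g * (z_grid - x)))
    print(f"g = {g:5.2f}: (13.2) holds on the grid: {holds}")
```

At $x = 0$ the graph of $\phi$ has a corner and a whole interval of slopes works; away from $0$ the subgradient is unique and equals the derivative $\pm 1$.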
From (13.1), the gradient of a differentiable convex function is always a subgradient. A basic result of convex analysis is that every convex function always has at least one subgradient at every point. We will denote the set of all subgradients of $\phi$ at $x$ as $\partial\phi(x)$, the subdifferential of $\phi$ at $x$.
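A pointwise maximum of affine functions gives a simple example of how a subgradient can be computed. If $\phi(x) = \max_i (a_i^T x + b_i)$ and the $i$-th term achieves the maximum at $x$, then $\phi(z) \geq a_i^T z + b_i = \phi(x) + a_i^T(z - x)$ for every $z$, so $a_i \in \partial\phi(x)$; when several terms are active at $x$, $\partial\phi(x)$ is the convex hull of their coefficient vectors. The sketch below (Python/numpy, with made-up data; not from the text) returns one active $a_i$ and spot-checks (13.2).

```python
import numpy as np

def max_affine_subgradient(A, b, x):
    """Return one subgradient of phi(x) = max_i (A[i] @ x + b[i]) at x."""
    i = int(np.argmax(A @ x + b))      # index of a term achieving the maximum
    return A[i]                        # gradient of that active affine term

# Made-up data for illustration.
A = np.array([[1.0, 2.0], [-1.0, 0.5], [0.0, -3.0]])
b = np.array([0.0, 1.0, 2.0])
x = np.array([0.5, -0.2])

phi = lambda z: float(np.max(A @ z + b))
g = max_affine_subgradient(A, b, x)

# Spot-check the subgradient inequality (13.2) at a few points z.
for z in (np.zeros(2), np.array([3.0, -1.0]), np.array([-2.0, 2.0])):
    assert phi(z) >= phi(x) + g @ (z - x) - 1e-12
print("one subgradient of phi at x:", g)
```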