Title

Viscosity solutions of the Bellman equation for exit time optimal control problems with vanishing Lagrangians

Document Type

Article

Publication Date

1-1-2002

Abstract

We study the Hamilton-Jacobi-Bellman equation for undiscounted exit time optimal control problems for fully nonlinear systems and fully nonlinear singular Lagrangians using the dynamic programming approach. We prove a local uniqueness theorem characterizing the value functions for these problems as the unique viscosity solutions of the corresponding Hamilton-Jacobi-Bellman equations that satisfy appropriate boundary conditions. The novelty of this theorem lies in the relaxed hypotheses on the lower bound on the Lagrangian and the very general assumptions on the target set. As a corollary, we show that the value function for the Fuller problem is the unique viscosity solution of the corresponding Hamilton-Jacobi-Bellman equation that vanishes at the origin and satisfies certain growth conditions. This implies, as special cases, first that the value function of this problem is the unique proper viscosity solution of the corresponding Hamilton-Jacobi-Bellman equation in the class of all functions that are continuous in the plane and null at the origin, and second that this value function is the unique viscosity solution of that equation in a class that includes functions that are not bounded below. We also apply our results to the degenerate eikonal equation of geometric optics and to the shape-from-shading equations in image processing. Our theorem also covers problems with noncompact targets, unbounded control sets, and Lagrangians that take negative values.
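
To fix notation for readers, the following is a minimal sketch, in standard dynamic programming notation rather than the paper's own, of the undiscounted exit time problem and its Hamilton-Jacobi-Bellman equation; the dynamics f, Lagrangian \ell, control set A, and target \mathcal{T} below are generic placeholders for the data treated in the article, not quotations from it.

% Sketch only; standard exit time formulation in generic notation, not taken from the article.
\[
v(x) \;=\; \inf_{\alpha \in \mathcal{A}} \int_0^{t_x(\alpha)} \ell\bigl(y_x(t;\alpha), \alpha(t)\bigr)\,dt,
\qquad \dot y_x = f(y_x, \alpha), \quad y_x(0) = x,
\]
where $t_x(\alpha)$ is the first time the trajectory reaches the target $\mathcal{T}$. The associated Hamilton-Jacobi-Bellman equation is
\[
\sup_{a \in A} \bigl\{ -f(x,a)\cdot Dv(x) - \ell(x,a) \bigr\} \;=\; 0
\quad \text{in } \mathbb{R}^N \setminus \mathcal{T},
\qquad v = 0 \ \text{on } \partial\mathcal{T}.
\]
In the Fuller example mentioned above, one common normalization (again an assumption, not a quotation from the paper) takes
\[
f(x,a) = (x_2,\, a), \qquad \ell(x,a) = x_1^2, \qquad A = [-1,1], \qquad \mathcal{T} = \{(0,0)\},
\]
so the Lagrangian vanishes on the entire line $x_1 = 0$ rather than only on the target; this is the vanishing Lagrangian phenomenon that the uniqueness theorem addresses.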

Publication Source (Journal or Book title)

SIAM Journal on Control and Optimization

First Page

1358

Last Page

1383
