Viscosity solutions of the Bellman equation for infinite horizon optimal control problems with negative instantaneous costs
In a series of papers, we characterized the value function in optimal control as the unique viscosity solution of the corresponding Bellman equation that satisfies appropriate side conditions. The novelty of our results was that they applied to exit time problems with general nonnegative instantaneous costs, including cases where the instantaneous cost is not uniformly bounded below by positive constants. This note extends these results to control problems whose instantaneous costs are allowed to take both positive and negative values, including undiscounted examples. We apply our results to the generalized Zubov equation, which corresponds to the Bellman equation for a negative instantaneous cost. The unique solutions of the Zubov equation are maximum cost Lyapunov functions for perturbed asymptotically stable systems. We study the regularity of these Lyapunov functions, and we further extend Zubov's method for representing domains of attraction as sublevel sets of Lyapunov functions. We also illustrate some special properties of maximum cost Lyapunov functions that can occur when the instantaneous cost for the Lyapunov function is degenerate.
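For orientation, the following is a sketch of the type of equations the abstract refers to, written in a common textbook form; the paper's precise formulation (state space, admissible controls, side conditions) may differ.

```latex
% Infinite horizon, undiscounted Bellman (HJB) equation for the system
% \dot{x} = f(x,a) with instantaneous cost \ell(x,a), value function v:
\sup_{a \in A} \bigl\{ -\nabla v(x) \cdot f(x,a) - \ell(x,a) \bigr\} = 0 .

% One classical form of Zubov's equation for \dot{x} = f(x) with a
% nonnegative state cost h \geq 0:
\nabla v(x) \cdot f(x) = -h(x)\,\bigl(1 - v(x)\bigr),
```

In Zubov's method, a solution $v$ with $0 \le v \le 1$ characterizes the domain of attraction of an asymptotically stable equilibrium as the sublevel set $\{x : v(x) < 1\}$; a logarithmic change of variables such as $W = -\ln(1 - v)$ relates such equations to Bellman equations of the first type, which is the kind of correspondence the abstract invokes when it identifies the generalized Zubov equation with a Bellman equation.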
Publication Source (Journal or Book title)
Proceedings of the IEEE Conference on Decision and Control
Malisoff, M. (2002). Viscosity solutions of the Bellman equation for infinite horizon optimal control problems with negative instantaneous costs. Proceedings of the IEEE Conference on Decision and Control, 1, 722-727. Retrieved from https://digitalcommons.lsu.edu/mathematics_pubs/1027