Degree

Doctor of Philosophy (PhD)

Department

Mathematics

Document Type

Dissertation

Abstract

Nonlinear optimization is a central branch of applied mathematics and has attracted wide attention because of its prevalence in practical applications. In this work, we present two methods that use first-order information to solve two typical classes of structured nonlinear optimization problems.

For a class of unconstrained nonconvex composite optimization problems, where the objective is the sum of a smooth but possibly nonconvex function and a convex but possibly nonsmooth function, we propose a unified proximal gradient method with extrapolation, which treats convex and nonconvex problems in a unified way. When the problem is convex, the method achieves the best-known convergence rate for first-order methods. When the problem is nonconvex, the method reduces to a proximal gradient method with extrapolation, and linear convergence of both the objective values and the generated iterates is established under suitable additional assumptions. The efficiency of the algorithm is demonstrated by numerical experiments.
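To make the problem setting concrete, the following is a minimal sketch of a proximal gradient step with Nesterov-style extrapolation (a FISTA-type scheme), not the dissertation's unified method, applied to the standard composite instance f(x) = 0.5‖Ax − b‖² (smooth) plus g(x) = λ‖x‖₁ (convex, nonsmooth), whose proximal operator is soft-thresholding. The problem data, stepsize, and extrapolation schedule are illustrative assumptions.

```python
# Sketch: proximal gradient with extrapolation (FISTA-style) for
# min_x 0.5*||Ax - b||^2 + lam*||x||_1. Illustrative only; the stepsize
# rule and extrapolation weights are assumptions, not the dissertation's.
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_extrapolation(A, b, lam, num_iters=500):
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of grad f
    alpha = 1.0 / L                          # fixed stepsize
    x_prev = x = np.zeros(n)
    t_prev = 1.0
    for _ in range(num_iters):
        t = (1.0 + np.sqrt(1.0 + 4.0 * t_prev ** 2)) / 2.0
        beta = (t_prev - 1.0) / t            # extrapolation weight
        y = x + beta * (x - x_prev)          # extrapolation step
        grad = A.T @ (A @ y - b)             # gradient of f at y
        x_prev, x = x, soft_threshold(y - alpha * grad, alpha * lam)
        t_prev = t
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true
x_hat = prox_grad_extrapolation(A, b, lam=0.1)
print("objective:", 0.5 * np.linalg.norm(A @ x_hat - b) ** 2
      + 0.1 * np.abs(x_hat).sum())
```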

For a family of nonconvex separable optimization problems with linear constraints, where the objective function is the sum of a smooth but possibly nonconvex function and a possibly nonsmooth nonconvex function, an inexact alternating direction method of multipliers (ADMM) is designed. The method solves its subproblems inexactly, to adaptive error criteria. An expansion step and a more flexible dual stepsize are exploited to accelerate convergence. Linear convergence of the generated iterates is guaranteed under proper conditions. Numerical examples illustrate that the method outperforms state-of-the-art ADMM algorithms.
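For reference, here is a minimal sketch of a generic ADMM with a dual stepsize τ on the convex splitting min 0.5‖Ax − b‖² + λ‖z‖₁ subject to x − z = 0. It solves its subproblems exactly and omits the expansion step, so it is not the dissertation's inexact ADMM; the penalty ρ, dual stepsize τ, and problem data are assumptions.

```python
# Sketch: scaled-form ADMM with dual stepsize tau for
# min_x 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x - z = 0.
# Generic, exact-subproblem ADMM for illustration only.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam, rho=1.0, tau=1.0, num_iters=300):
    n = A.shape[1]
    z = np.zeros(n)
    u = np.zeros(n)                          # scaled dual variable y / rho
    M = A.T @ A + rho * np.eye(n)            # x-subproblem is a linear solve
    Atb = A.T @ b
    for _ in range(num_iters):
        x = np.linalg.solve(M, Atb + rho * (z - u))  # x-update (exact)
        z = soft_threshold(x + u, lam / rho)         # z-update (prox of l1)
        u = u + tau * (x - z)                        # dual update, stepsize tau
    return z

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true
z_hat = admm_lasso(A, b, lam=0.1, tau=1.5)
print("nonzeros:", np.count_nonzero(np.abs(z_hat) > 1e-6))
```

For convex problems of this form, a dual stepsize τ in (0, (1 + √5)/2) is the classical range guaranteeing convergence, which is why τ = 1.5 is used above.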

Date

4-4-2023

Committee Chair

Zhang, Hongchao

DOI

10.31390/gradschool_dissertations.6098
