Author: Au, Kelly Thurston.
Higle, Julia L.
Publisher: The University of Arizona.
Rights: Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction, or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
Abstract: In solving a mathematical program, the exact evaluation of the objective function and its subgradients can be computationally burdensome. For example, in a stochastic program, the objective function is typically defined through a multi-dimensional integration. Solution procedures for stochastic programs are usually based on functional approximation techniques or on statistical applications of subgradient methods. In this dissertation, we explore algorithms that combine functional approximation techniques with subgradient optimization methods. This class of algorithms is referred to as "inexact subgradient methods". First, we develop a basic inexact subgradient method and identify conditions under which this approach leads to an optimal solution. We also present an inexact subgradient algorithm in which the steplengths are defined adaptively via estimated bounds on the deviation from optimality. Second, we explore approaches in which functional approximation techniques can be combined with a primal-dual subgradient method. In these algorithms, the steplengths are defined via primal and dual information; hence, progress toward optimality can be reflected in the steplengths as the iterations proceed. We also incorporate space dilation operations, which stabilize the search directions, within our basic inexact subgradient method. As an example of the applicability of these methods, we use statistically defined approximations, similar to those derived in Stochastic Decomposition, in some of our algorithms to solve stochastic programs.
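To illustrate the idea behind the class of methods the abstract describes, the following is a minimal sketch of a basic inexact subgradient iteration. The oracle `noisy_subgrad`, the test objective, and the diminishing steplength rule are illustrative assumptions for this sketch, not the dissertation's exact algorithm or convergence conditions.

```python
# Minimal sketch: subgradient descent where the subgradient oracle is
# inexact (e.g., estimated statistically rather than computed exactly).
import numpy as np

def inexact_subgradient(noisy_subgrad, x0, steps=500):
    """Iterate x_{k+1} = x_k - a_k * g_k, where g_k is an inexact
    subgradient and a_k = 1/(k+1) is a diminishing steplength."""
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        g = noisy_subgrad(x)      # approximate subgradient at x
        a = 1.0 / (k + 1)         # diminishing, non-summable steps
        x = x - a * g
    return x

# Example: minimize f(x) = ||x||_1, with subgradients sign(x)
# perturbed by small zero-mean noise to mimic inexact evaluation.
rng = np.random.default_rng(0)
g_inexact = lambda x: np.sign(x) + 0.05 * rng.standard_normal(x.shape)
x_final = inexact_subgradient(g_inexact, x0=[3.0, -2.0])
```

With diminishing steplengths, the iterates approach the minimizer at the origin despite the noisy subgradients, which is the flavor of result the dissertation establishes under precise conditions on the approximation error.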
Degree Program: Systems and Industrial Engineering