| Title | Convex and Concave Relaxations of Implicit Functions |
| Publication Type | Journal Article |
| Year of Publication | 2015 |
| Authors | Stuber MD, Scott JK, Barton PI |
| Journal | Optimization Methods and Software |
| Keywords | global optimization, McCormick relaxations, nonconvex programming |
A deterministic algorithm for solving nonconvex NLPs to global optimality using a reduced-space approach is presented. Such problems arise when real-world models are embedded as nonlinear equality constraints and the decision variables include the state variables of the system. By solving the model equations for the dependent (state) variables as implicit functions of the independent (decision) variables, a significant reduction in dimensionality is obtained. The objective function and inequality constraints then become implicit functions of the independent variables, which can be evaluated via a fixed-point iteration. Building on the recently developed ideas of generalized McCormick relaxations, McCormick-based relaxations of algorithms, and subgradient propagation, McCormick relaxations of implicit functions are developed. With these relaxations, the reduced-space, implicit optimization formulation can be relaxed. When applied within a branch-and-bound framework, finite convergence to ε-optimal global solutions is guaranteed.
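The two building blocks mentioned in the abstract can be illustrated with a small sketch. The model equation below, `h(x, p) = x + exp(x) - p = 0`, is a hypothetical toy example (not from the paper) chosen so that the successive-substitution fixed-point iteration is a contraction; the bilinear envelopes are the standard McCormick under- and overestimators of `w = x*y` on a box.

```python
import math

# Hypothetical toy model h(x, p) = x + exp(x) - p = 0, defining the state
# x(p) implicitly. The rearrangement x <- p - exp(x) is a contraction near
# the solution (|d/dx (p - exp(x))| = exp(x) < 1 for x < 0), so successive
# substitution converges; this stands in for the fixed-point evaluation of
# the implicit function referenced in the abstract.
def implicit_state(p, x0=0.0, tol=1e-12, max_iter=200):
    x = x0
    for _ in range(max_iter):
        x_new = p - math.exp(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("fixed-point iteration did not converge")

# Standard McCormick envelopes of the bilinear term w = x*y on
# [xL, xU] x [yL, yU]: the convex underestimator is the max of two
# supporting planes, the concave overestimator the min of the other two.
def mccormick_bilinear(x, y, xL, xU, yL, yU):
    cv = max(xL * y + x * yL - xL * yL, xU * y + x * yU - xU * yU)
    cc = min(xU * y + x * yL - xU * yL, xL * y + x * yU - xL * yU)
    return cv, cc

if __name__ == "__main__":
    x = implicit_state(0.5)
    print("x(0.5) =", x, "residual =", x + math.exp(x) - 0.5)

    cv, cc = mccormick_bilinear(0.3, -0.2, 0.0, 1.0, -1.0, 1.0)
    print("cv <= x*y <= cc:", cv <= 0.3 * -0.2 <= cc)
```

The paper's contribution goes well beyond this sketch: it propagates such relaxations (and their subgradients) through the fixed-point iteration itself, yielding convex and concave bounds on the implicit function over a whole box of decision variables rather than at a single point.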