A vector forward mode of automatic differentiation for generalized derivative evaluation

Title: A vector forward mode of automatic differentiation for generalized derivative evaluation
Publication Type: Journal Article
Year of Publication: 2015
Authors: Khan KA, Barton PI
Journal: Optimization Methods and Software
Volume: 30
Issue: 6
Pagination: 1185-1212
Date Published: 05/2015
Abstract

Numerical methods for non-smooth equation-solving and optimization often require generalized derivative information in the form of elements of the Clarke Jacobian or the B-subdifferential. It is shown here that piecewise differentiable functions are lexicographically smooth in the sense of Nesterov, and that lexicographic derivatives of these functions comprise a particular subset of both the B-subdifferential and the Clarke Jacobian. Several recently developed methods for generalized derivative evaluation of composite piecewise differentiable functions are shown to produce identical results, which are also lexicographic derivatives. A vector forward mode of automatic differentiation (AD) is presented for evaluation of these derivatives, generalizing established methods and combining their computational benefits. This forward AD mode may be applied to any finite composition of known smooth functions, piecewise differentiable functions such as the absolute value function, min, and max, and certain non-smooth functions which are not piecewise differentiable, such as the Euclidean norm. This forward AD mode may be implemented using operator overloading, does not require storage of a computational graph, and is computationally tractable relative to the cost of a function evaluation. An implementation in C++ is discussed.
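To make the operator-overloading idea concrete, the following is a minimal C++ sketch of a vector forward sweep in the spirit described above; it is illustrative only and is not the authors' implementation. The type name LDScalar, the member names, and the lexicographic tie-break used in the overload of the absolute value are assumptions introduced here for exposition: each variable carries a value together with a vector of directional derivatives, smooth operations propagate these componentwise by the usual forward-mode rules, and the non-smooth abs resolves its sign by inspecting the value and then the directional derivatives in order.

```cpp
#include <cmath>
#include <cstddef>
#include <iostream>
#include <vector>

// Hypothetical type for the sketch: a scalar value plus a vector of
// directional derivatives, one entry per direction in the sweep.
struct LDScalar {
    double val;               // function value
    std::vector<double> dot;  // directional derivatives
    LDScalar(double v, std::vector<double> d) : val(v), dot(std::move(d)) {}
};

// Smooth rules: ordinary forward-mode propagation, applied
// componentwise to the derivative vector.
LDScalar operator+(const LDScalar& a, const LDScalar& b) {
    std::vector<double> d(a.dot.size());
    for (std::size_t i = 0; i < d.size(); ++i) d[i] = a.dot[i] + b.dot[i];
    return {a.val + b.val, d};
}

LDScalar operator*(const LDScalar& a, const LDScalar& b) {
    std::vector<double> d(a.dot.size());
    for (std::size_t i = 0; i < d.size(); ++i)
        d[i] = a.dot[i] * b.val + a.val * b.dot[i];
    return {a.val * b.val, d};
}

// Non-smooth rule for |x| (illustrative): pick the sign from the value
// if it is nonzero, otherwise from the first nonzero directional
// derivative, so a well-defined derivative is returned even at x = 0.
LDScalar abs(const LDScalar& a) {
    double s = 0.0;
    if (a.val != 0.0) {
        s = (a.val > 0.0) ? 1.0 : -1.0;
    } else {
        for (double di : a.dot) {
            if (di != 0.0) { s = (di > 0.0) ? 1.0 : -1.0; break; }
        }
    }
    std::vector<double> d(a.dot.size());
    for (std::size_t i = 0; i < d.size(); ++i) d[i] = s * a.dot[i];
    return {std::fabs(a.val), d};
}

int main() {
    // Example: f(x, y) = |x| + x*y at (0, 2), with the coordinate
    // directions as the two directions of the sweep.
    LDScalar x(0.0, {1.0, 0.0});
    LDScalar y(2.0, {0.0, 1.0});
    LDScalar f = abs(x) + x * y;
    std::cout << "f = " << f.val
              << ", derivatives = [" << f.dot[0] << ", " << f.dot[1] << "]\n";
    return 0;
}
```

In this example the sweep returns the derivative row [3, 0] at the non-differentiable point (0, 2), which agrees with differentiating the locally active smooth branch x + xy selected by the first direction; no computational graph is stored, and the cost scales with the number of directions, consistent with the tractability claim in the abstract.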

URL: http://www.tandfonline.com/doi/full/10.1080/10556788.2015.1025400
DOI: 10.1080/10556788.2015.1025400