Abstract
This paper describes an investigation into the performance of three Ada packages for automatic differentiation. Two of these implement the forward accumulation approach while the third employs reverse accumulation. Each package is used to provide gradient information required by a number of optimization calculations, including examples of unconstrained, constrained and least-squares problems. The results show how automatic differentiation methods can be influenced in practice by the size, complexity and sparsity of a problem. They also demonstrate ways in which the methods should interface with different types of optimization procedure. Finally, and perhaps most significantly, the results show how the performance of automatic differentiation codes can depend on hardware and system software considerations that are sometimes ignored by numerical mathematicians.
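Forward accumulation, as used by two of the packages studied here, can be illustrated with dual numbers: each value carries its derivative, and arithmetic operators propagate both. The sketch below is illustrative Python, not the Ada packages the paper evaluates; the `Dual` class and `grad` helper are hypothetical names introduced for this example.

```python
# Minimal sketch of forward-accumulation automatic differentiation
# using dual numbers (illustrative only; not the paper's Ada code).

class Dual:
    """A value paired with its derivative; arithmetic propagates both."""
    def __init__(self, val, dot=0.0):
        self.val = val
        self.dot = dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def grad(f, x):
    """Gradient of f: R^n -> R at x; one forward pass per coordinate."""
    n = len(x)
    g = []
    for i in range(n):
        # Seed the i-th input with derivative 1, all others with 0.
        args = [Dual(x[j], 1.0 if i == j else 0.0) for j in range(n)]
        g.append(f(args).dot)
    return g

# Example: f(x, y) = x*y + x*x has gradient (y + 2x, x).
f = lambda v: v[0] * v[1] + v[0] * v[0]
print(grad(f, [3.0, 2.0]))  # prints [8.0, 3.0]
```

Note the cost pattern the paper's results reflect: forward accumulation needs one sweep per input variable, whereas reverse accumulation obtains the whole gradient in a single backward sweep, at the price of recording the computation.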
A subsidiary aim of this paper is to provide a “shop window” for user-friendly forms of automatic differentiation. The underlying mathematical ideas have been quite widely discussed in the literature, but their implementation and use seem to have been perceived as too difficult for the non-specialist. The examples in this paper are intended to demonstrate that this need not be the case.
| Field | Value |
| --- | --- |
| Original language | English |
| Pages (from-to) | 47-73 |
| Number of pages | 27 |
| Journal | Optimization Methods and Software |
| Volume | 4 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 1994 |