This column is written by Alan Weiss, the writer for Optimization Toolbox documentation.

You may know that solving an optimization problem, meaning finding a point where a function is minimized, is easier when you have the gradient of the function. This is easy to understand: the gradient points uphill, so if you travel in the opposite direction, you generally reach a minimum.
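To make that downhill idea concrete, here is a minimal sketch, not from the post itself, of repeatedly stepping against the gradient of a simple quadratic; the objective, step size, and starting point are illustrative choices.

```matlab
% Minimal "travel against the gradient" sketch on an illustrative quadratic.
f    = @(x) x(1)^2 + 10*x(2)^2;    % objective value
grad = @(x) [2*x(1); 20*x(2)];     % hand-coded gradient of the objective
x = [4; -1];                       % illustrative starting point
stepSize = 0.04;                   % small fixed step size
for k = 1:200
    x = x - stepSize*grad(x);      % step in the direction opposite the gradient
end
disp(f(x))                         % close to the minimum value of 0 at [0;0]
```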
Optimization Toolbox solvers are based on algorithms more sophisticated than this simple downhill idea, yet these more sophisticated algorithms also benefit from a gradient. How do you give a gradient to a solver along with the function? Until recently, you had to calculate the gradient as a separate output, with all the pain and possibility of error that entails.
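Here is a minimal sketch of that hand-coded approach, assuming the solver-based fminunc workflow; the Rosenbrock-style objective, the name rosenboth, and the starting point are illustrative choices, not code from the post. You write the gradient yourself as a second output and set the SpecifyObjectiveGradient option so the solver uses it.

```matlab
% Supply the gradient yourself: the objective returns it as a second output,
% and SpecifyObjectiveGradient tells the solver to use that output.
opts = optimoptions('fminunc', ...
    'Algorithm','trust-region', ...
    'SpecifyObjectiveGradient',true);
x0 = [-1.9 2];
[xsol,fval] = fminunc(@rosenboth, x0, opts)

function [f,g] = rosenboth(x)
% Rosenbrock-style objective and its hand-coded gradient
f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
if nargout > 1   % gradient requested by the solver
    g = [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
          200*(x(2) - x(1)^2)];
end
end
```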
However, with R2020b, the problem-based approach uses automatic differentiation for the calculation of problem gradients for general nonlinear optimization problems. I will explain what all of those words mean. In a nutshell, as long as your function is composed of elementary functions such as polynomials, trigonometric functions, and exponentials, Optimization Toolbox calculates and uses the gradients of your functions automatically, with no effort on your part. Automatic differentiation, also called AD, is a type of symbolic derivative that transforms a function into code that calculates the function values and derivative values at particular points. The "general nonlinear" phrase means that automatic differentiation applies to problems that fmincon and fminunc solve, which are general constrained or unconstrained minimization, as opposed to linear programming or least-squares or other problem types that call other specialized solvers. This process is transparent: you do not have to write any special code to use AD.
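As a minimal sketch of what that looks like, here is the same illustrative objective as above written in the problem-based approach; the objective and starting point are my illustrative choices, not code from the post. There is no gradient code and no gradient-related option anywhere; with R2020b or later, solve obtains the gradients by AD behind the scenes.

```matlab
% Problem-based formulation: no gradient code, no gradient options.
x = optimvar('x',2);                                   % optimization variable
prob = optimproblem;
prob.Objective = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2; % same illustrative objective
x0.x = [-1.9; 2];                                      % initial point, given as a struct
[sol,fval] = solve(prob,x0)                            % gradients supplied by AD (R2020b or later)
disp(sol.x)                                            % solution point
```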