# Amoeba Optimizer

## Synopsis

This example demonstrates how to minimize a function with the AmoebaOptimizer class, using a simple paraboloid cost function.

## Results

Output:

```
Position: [-5.003825599641884, 6.998563761340231]
Value: 5.00002
```

## Code

### C++

```cpp
#include "itkAmoebaOptimizer.h"
#include "ExampleCostFunction.h"

namespace
{
// Typedef the optimizer and cost function, for convenience
using OptimizerType = itk::AmoebaOptimizer;
using CostType = itk::ExampleCostFunction2;
} // namespace

int
main()
{

// Instantiate the optimizer
auto optimizer = OptimizerType::New();

// Set properties pertinent to convergence
optimizer->SetMaximumNumberOfIterations(100);
optimizer->SetParametersConvergenceTolerance(0.01);
optimizer->SetFunctionConvergenceTolerance(0.01);

// Instantiate the cost function
// The cost function is a 2D paraboloid in the x-y plane
// with the equation f(x,y) = (x+5)^2+(y-7)^2 + 5
// and a global minimum at (x,y) = (-5, 7)
auto cost = CostType::New();

// Assign the cost function to the optimizer
optimizer->SetCostFunction(cost.GetPointer());

// Set the initial parameters of the cost function
OptimizerType::ParametersType initial(2);
initial[0] = 123;
initial[1] = -97.4;
optimizer->SetInitialPosition(initial);

// Begin the optimization!
optimizer->StartOptimization();

// Print out some information about the optimization
std::cout << "Position: " << optimizer->GetCurrentPosition() << std::endl;
std::cout << "Value: " << optimizer->GetValue() << std::endl;

// As expected, the position is near to (-5, 7) and the value to 5
// Position: [-5.003825599641884, 6.998563761340231]
// Value: 5.00002
return EXIT_SUCCESS;
}
```

## Classes demonstrated

`class AmoebaOptimizer : public itk::SingleValuedNonLinearVnlOptimizer`

Wrapper around the vnl_amoeba algorithm.

AmoebaOptimizer is a wrapper around the vnl_amoeba algorithm, an implementation of the Nelder-Mead downhill simplex method. For most problems, it is a few times slower than a Levenberg-Marquardt algorithm but does not require derivatives of its cost function. It works by creating a simplex (n+1 points in n-dimensional space). The cost function is evaluated at each corner of the simplex. The simplex is then modified (by reflecting a corner about the opposite edge, by shrinking the entire simplex, by contracting one edge of the simplex, or by expanding the simplex) in searching for the minimum of the cost function.

The methods AutomaticInitialSimplex() and SetInitialSimplexDelta() control whether the optimizer defines the initial simplex automatically (by constructing a very small simplex around the initial position) or uses a user-supplied simplex size.
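Continuing the example above, a user-supplied simplex size might look like the following sketch (the delta values here are illustrative, not recommendations):

```cpp
// Sketch: supply the initial simplex size explicitly instead of letting
// the optimizer build a very small simplex around the initial position.
OptimizerType::ParametersType simplexDelta(2);
simplexDelta[0] = 10.0; // offset along the first parameter axis
simplexDelta[1] = 10.0; // offset along the second parameter axis
optimizer->SetInitialSimplexDelta(simplexDelta);
```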

The method SetOptimizeWithRestarts() indicates that the amoeba algorithm should be rerun after it converges. This heuristic increases the chances of escaping from a local optimum. Each time, the simplex is initialized with the best solution obtained by the previous runs, with an edge length half of that from the previous run. The heuristic terminates when the total number of iterations reaches the maximum number of iterations (SetMaximumNumberOfIterations), or when both the difference between the current function value and the best function value is less than a threshold (SetFunctionConvergenceTolerance) and max(|best_parameters_i - current_parameters_i|) is less than a threshold (SetParametersConvergenceTolerance).
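Enabling the restart heuristic in the example above could be sketched like this (the iteration budget and tolerance values are illustrative):

```cpp
// Sketch: rerun the amoeba after each convergence, restarting from the
// best solution found so far with half the previous simplex edge length.
optimizer->SetOptimizeWithRestarts(true);
// The restart loop stops once the iteration budget is exhausted, or once
// consecutive runs agree to within both tolerances below.
optimizer->SetMaximumNumberOfIterations(500);
optimizer->SetFunctionConvergenceTolerance(1e-4);
optimizer->SetParametersConvergenceTolerance(1e-4);
```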
