You are working with the text-only light edition of "H.Lohninger: Teach/Me Data Analysis, Springer-Verlag, Berlin-New York-Tokyo, 1999. ISBN 3-540-14743-8".
See also: optimization, gradient descent
The simplex algorithm of Nelder and Mead is a direct search procedure that requires no derivatives. It is quite popular, since it is based on simple principles and is therefore easy to understand and implement [1] (do not confuse this simplex algorithm with the simplex method in linear programming). The idea is as follows: if the parameter to be optimized depends on k variables, then select k+1 points. These points must not all lie in a common hyperplane; they must span a non-degenerate simplex. Now, find the points with the worst response, the best response, and the next-to-worst response. Next, mirror the worst point about the centroid of the face spanned by the other k points. This procedure is repeated until no further improvement is achieved. An additional rule has to be introduced in order to prevent the simplex from oscillating about a ridge: if the mirrored point remains the worst point, then use the next-to-worst point for reflection across the centroid.
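The reflection step and the anti-oscillation rule described above can be sketched in Python with NumPy. This is a minimal, reflection-only sketch, not a complete implementation: the function names, the fixed iteration budget, and the best-point bookkeeping are my own choices, and a full Nelder-Mead routine would also expand and contract the simplex, which the text does not cover.

```python
import numpy as np

def simplex_maximize(f, simplex, n_iter=500):
    """Reflection-only simplex search maximizing f.

    f       -- objective function mapping a point (length-k array) to a response
    simplex -- (k+1) starting points spanning a non-degenerate simplex
    """
    simplex = np.asarray(simplex, dtype=float)
    best_point, best_val = None, -np.inf
    for _ in range(n_iter):
        # order vertices by response: index 0 = worst, index -1 = best
        order = np.argsort([f(p) for p in simplex])
        simplex = simplex[order]
        if f(simplex[-1]) > best_val:
            best_val, best_point = f(simplex[-1]), simplex[-1].copy()
        # mirror the worst point about the centroid of the remaining k points
        centroid = simplex[1:].mean(axis=0)
        reflected = 2.0 * centroid - simplex[0]
        if f(reflected) > f(simplex[0]):
            simplex[0] = reflected
        else:
            # anti-oscillation rule: the mirrored point would remain the
            # worst, so reflect the next-to-worst point instead
            centroid = np.delete(simplex, 1, axis=0).mean(axis=0)
            simplex[1] = 2.0 * centroid - simplex[1]
    return best_point
```

Because pure reflections preserve the size of the simplex, the attainable accuracy is roughly the diameter of the starting simplex; shrinking steps would be needed for higher precision.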
Please note that simplex optimization is prone to getting trapped in local optima. The results of a simplex run depend on the starting conditions. In order to raise the chance of finding the global optimum, one should perform several simplex runs with different starting conditions.
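The restart strategy can be sketched using SciPy's Nelder-Mead implementation (scipy.optimize.minimize with method='Nelder-Mead'). The test function, the grid of starting points, and the run count below are illustrative assumptions; note that SciPy minimizes, so the "best response" corresponds to the smallest function value.

```python
import numpy as np
from scipy.optimize import minimize

# Easom-like test function (an illustrative choice): nearly flat almost
# everywhere, with one narrow global minimum of -1 at (pi, pi).
# Simplex runs started on the flat plateau stall far from the optimum.
def f(p):
    x, y = p
    return -np.cos(x) * np.cos(y) * np.exp(-((x - np.pi)**2 + (y - np.pi)**2))

# A grid of starting points; each one seeds an independent simplex run.
starts = [np.array([x, y]) for x in (1.0, 3.0, 5.0) for y in (1.0, 3.0, 5.0)]
runs = [minimize(f, x0, method='Nelder-Mead') for x0 in starts]

# Keep the best result over all restarts.
best = min(runs, key=lambda r: r.fun)
```

Most of the nine runs stall on the plateau with a function value near zero; only the runs started close to (pi, pi) descend into the narrow bowl, which is exactly why several restarts are needed.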
[1] J.A. Nelder, R. Mead: A Simplex Method for Function Minimization. Computer Journal 7 (1965) 308
Last Update: 2004-Jul-03