Publication Date

1-2006

Comments

UTEP-CS-05-37a.

Published in: A. Torn and J. Zilinskas (eds.), Models and Algorithms for Global Optimization, Springer, New York, 2007, pp. 21-42.

Abstract

Most techniques for solving global optimization problems have parameters that need to be adjusted to the problem or to the class of problems: for example, in gradient methods, we can select different step sizes. When we have a single parameter (or a few parameters) to choose, it is possible to empirically try many values and come up with an (almost) optimal value. Thus, in such situations, we can come up with an (almost) optimal version of the corresponding technique.
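
As a rough illustration of this kind of empirical tuning (a minimal sketch; the quadratic test objective, iteration budget, and list of candidate step sizes are illustrative assumptions, not taken from the paper):

    import numpy as np

    def gradient_descent(grad, x0, step, n_iters=100):
        # Plain gradient descent with a fixed step size.
        x = np.asarray(x0, dtype=float)
        for _ in range(n_iters):
            x = x - step * grad(x)
        return x

    # Illustrative quadratic test problem: f(x) = 0.5 * x^T A x, with gradient A x.
    A = np.diag([1.0, 10.0])
    f = lambda x: 0.5 * x @ A @ x
    grad_f = lambda x: A @ x

    # Empirically try several candidate step sizes and keep the one that
    # yields the smallest objective value after a fixed number of iterations.
    candidates = [0.001, 0.01, 0.05, 0.1, 0.2]
    best_step = min(candidates,
                    key=lambda s: f(gradient_descent(grad_f, [1.0, 1.0], s)))
    print("empirically best step size:", best_step)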

In other approaches, e.g., in methods like convex underestimators, instead of selecting the value of a single numerical parameter, we have to select an auxiliary function. It is not practically possible to test all possible functions, so it is not easy to come up with an optimal version of the corresponding technique.
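
As a concrete illustration of what selecting such an auxiliary function means (this is the widely used αBB construction, given here only as an example and not claimed to be the specific family studied in the paper): for a twice-differentiable objective $f$ on a box $[l, u]$, one underestimates $f$ by

$$L(x) \;=\; f(x) \;-\; \alpha \sum_{i=1}^{n} (x_i - l_i)(u_i - x_i), \qquad \alpha \;\ge\; \max\Bigl(0,\; -\tfrac{1}{2}\,\min_{x\in[l,u]} \lambda_{\min}\bigl(\nabla^2 f(x)\bigr)\Bigr),$$

so that $L(x) \le f(x)$ on the box and $L$ is convex there; the added quadratic correction plays the role of the auxiliary function that has to be chosen.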

In this paper, we show that in many such situations, natural symmetry requirements enable us either to analytically solve the problem of finding the optimal auxiliary function, or at least to reduce this problem to the easier-to-solve problem of finding a few parameters.
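
A standard example of such a symmetry argument (given here only as an illustration of the general idea, not as the specific derivation used in the paper): suppose the auxiliary function $g$ must be invariant under rescaling of its input, in the sense that $g(\lambda x) = c(\lambda)\, g(x)$ for all $\lambda > 0$ and some factor $c(\lambda)$. For measurable $g$ on $x > 0$, the only solutions are power laws $g(x) = A\, x^{\alpha}$, so the problem of choosing an arbitrary function $g$ reduces to the much easier problem of choosing the two parameters $A$ and $\alpha$.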

tr05-37.pdf (196 kB)
Original file: UTEP-CS-05-37
