Abstract
Engineering systems are now frequently optimized via computer models. The input-output relationships in these models are often highly nonlinear deterministic functions that are expensive to compute. Thus, when searching for the global optimum, it is desirable to minimize the number of function evaluations. Bayesian global optimization methods are well-suited to this task because they make use of all previous evaluations in selecting the next search point. A statistical model is fit to the sampled points, which allows predictions to be made elsewhere, along with a measure of possible prediction error (uncertainty). The next point is chosen to maximize a criterion that balances searching where the predicted value of the function is good (local search) with searching where the uncertainty of prediction is large (global search). We extend this methodology in several ways. First, we introduce a parameter that controls the local-global balance. Second, we propose a method for dealing with nonlinear inequality constraints from additional response variables. Lastly, we adapt the sequential algorithm to proceed in stages rather than one point at a time. The extensions are illustrated using a shape optimization problem from the automotive industry.

Introduction. Global optimization via a computer model (sometimes called a computer code) is a problem encountered frequently in engineering. In this article, for example, we will discuss the optimization of the shape of an automobile piston. The inputs to the piston model are parameters describing the piston shape. The outputs are quality characteristics: undesirable piston motion (which causes noise) and the maximum pressure between the piston and the bore (which affects wear). The objective is to find the combination of shape parameters that minimizes maximum pressure subject to a constraint on motion. When function evaluations are fairly expensive, as here, there is a need to use optimization methods that require few evaluations. We shall see that the objective, maximum pressure, is highly nonlinear in the shape parameters; hence, some care is also necessary to find the global optimum.
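The criterion described in the abstract is the expected-improvement idea that underlies this family of methods. As a rough illustration only, the sketch below computes the standard expected-improvement criterion from a surrogate model's prediction and standard error at a candidate point; the function name is ours, a Gaussian-process surrogate is assumed, and the paper's local-global balance parameter generalizes this basic form rather than being reproduced here.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_min):
    """Standard expected-improvement criterion for minimization.

    mu, sigma : surrogate prediction and its standard error at a candidate
                point (e.g. from a fitted Gaussian-process model).
    f_min     : best (smallest) objective value observed so far.
    """
    sigma = np.maximum(sigma, 1e-12)  # guard against zero predicted uncertainty
    z = (f_min - mu) / sigma
    # First term favors points predicted to improve on f_min (local search);
    # second term favors points with large prediction uncertainty (global search).
    return (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)
```

In a sequential design, the next evaluation point would be the candidate maximizing this criterion over the input space.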
Related Publications
Computer experiments and global optimization
A complex mathematical model that produces output values from input values is now commonly called a computer model. This thesis considers the problem of finding the global optim...
A genetic local search algorithm for solving symmetric and asymmetric traveling salesman problems
The combination of local search heuristics and genetic algorithms is a promising approach for finding near-optimum solutions to the traveling salesman problem (TSP). An approach...
Line Search Filter Methods for Nonlinear Programming: Motivation and Global Convergence
Line search methods are proposed for nonlinear programming using Fletcher and Leyffer's filter method [Math. Program., 91 (2002), pp. 239--269], which replaces the traditional m...
Evolutionary programming made faster
Evolutionary programming (EP) has been applied with success to many numerical and combinatorial optimization problems in recent years. EP has rather slow convergence rates, howe...
A multi-objective genetic local search algorithm and its application to flowshop scheduling
We propose a hybrid algorithm for finding a set of nondominated solutions of a multi-objective optimization problem. In the proposed algorithm, a local search procedure is appli...
Publication Info
- Year: 1998
- Type: book-chapter
- Pages: 11-25
- Citations: 425
- Access: Closed
Identifiers
- DOI: 10.1214/lnms/1215456182