Response Surface Methodology for Optimizing Hyper Parameters

Abstract
The performance of an algorithm often depends strongly on hyperparameters that should be optimized before use. Since most conventional optimization methods suffer from drawbacks, we developed an alternative way to find the best hyperparameter values. In contrast to the well-known procedures, the new optimization algorithm is based on statistical methods: it combines Linear Mixed Effect Models with Response Surface Methodology techniques. In particular, the Method of Steepest Ascent, which is well known for the case of an Ordinary Least Squares setting with a linear response surface, has been generalized to be applicable to repeated-measurement situations and to response surfaces of order o ≥ 2.
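To illustrate the underlying idea, the following is a minimal sketch of the classical steepest-ascent step from Response Surface Methodology, using a plain first-order OLS surface rather than the paper's Linear Mixed Effect Model generalization. The toy performance function, region sizes, and sample counts are all hypothetical choices for illustration only.

```python
import numpy as np

def performance(x, rng):
    # Hypothetical noisy performance of two hyperparameters;
    # the (unknown) optimum sits at (1.0, -0.5).
    return -(x[0] - 1.0) ** 2 - (x[1] + 0.5) ** 2 + rng.normal(scale=0.05)

def steepest_ascent_step(center, radius, n_samples, rng):
    # Run experiments at design points in a box around the current center.
    X = center + rng.uniform(-radius, radius, size=(n_samples, 2))
    y = np.array([performance(p, rng) for p in X])
    # OLS fit of the local first-order surface y ~ b0 + b1*x1 + b2*x2.
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    grad = beta[1:]
    # Move one step along the estimated steepest-ascent direction,
    # scaled to the size of the experimental region.
    return center + radius * grad / (np.linalg.norm(grad) + 1e-12)

rng = np.random.default_rng(0)
x = np.array([-1.0, 1.5])
for i in range(30):
    # Shrink the experimental region as the search homes in.
    x = steepest_ascent_step(x, radius=0.5 * 0.9 ** i, n_samples=20, rng=rng)
print(x)  # approaches the optimum near (1.0, -0.5)
```

The paper's contribution replaces the OLS fit above with a mixed-effects fit that accounts for repeated measurements, and allows higher-order surfaces; this sketch only shows the basic ascent loop that is being generalized.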
Related Publications
A Comparison of Least Squares and Latent Root Regression Estimators
Multicollinearity among the columns of regressor variables is known to cause severe distortion of the least squares estimates of the parameters in a multiple linear regression ...
Least angle regression
The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to whi...
Applied Linear Regression
Preface. 1 Scatterplots and Regression. 1.1 Scatterplots. 1.2 Mean Functions. 1.3 Variance Functions. 1.4 Summary Graph. 1.5 Tools for Looking at Scatterplots. 1.5.1 Size. 1.5.2 Transfo...
Orthogonal least squares methods and their application to non-linear system identification
Identification algorithms based on the well-known linear least squares methods of Gaussian elimination, Cholesky decomposition, classical Gram-Schmidt, modified ...
Publication Info
- Year
- 2006
- Type
- article
- Citations
- 5
- Access
- Closed