Parameterized Geometric Optimization
This project studies algorithmic and combinatorial questions in parameterized geometric optimization as it arises in several machine learning applications. Many successful algorithms in machine learning, but also in other fields like graphics and visualization, have freely adjustable parameters, and the performance of these algorithms is typically quite sensitive to the choice of these parameters.

One parameter of particular interest in supervised learning controls the trade-off between the complexity of the learned model and its accuracy on the training data. This parameter, referred to as the regularization parameter, typically appears in an optimization problem with two conflicting objectives: minimizing the model complexity and minimizing the training error. The solution of this optimization problem, viewed as a function of the regularization parameter, is called the regularization path. For some very popular machine learning techniques, such as support vector machines and support vector regression, the optimization problem is a convex quadratic program, and the regularization path is known to be piecewise linear. These techniques have a strong geometric flavor, and geometric methods can be employed to solve them.

In this project we are interested in the complexity of regularization paths (or, more generally, solution paths, when parameters other than the regularization parameter are varied) of several learning algorithms, especially algorithms for preference analysis. We are also interested in geometric methods for computing these paths and in applying these methods in preference analysis applications.
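The piecewise-linear structure of a regularization path can already be seen in the simplest possible setting. The following sketch is purely illustrative (it is not the project's method): for the one-dimensional lasso problem min_w (1/2)(w − y)² + λ|w|, the solution is given in closed form by soft thresholding, so the path w(λ) is piecewise linear with a single breakpoint at λ = |y|.

```python
def lasso_1d(y, lam):
    """Closed-form solution of the 1-D lasso
        min_w (1/2)(w - y)^2 + lam * |w|
    for a data value y and regularization parameter lam >= 0
    (the soft-thresholding operator)."""
    if y > lam:
        return y - lam
    if y < -lam:
        return y + lam
    return 0.0


def regularization_path(y, lams):
    """Evaluate the solution path over a list of lambda values."""
    return [lasso_1d(y, lam) for lam in lams]


# The path decreases linearly from y to 0 and then stays at 0,
# i.e. it is piecewise linear with a breakpoint at lam = |y|:
path = regularization_path(2.0, [0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
# path == [2.0, 1.5, 1.0, 0.5, 0.0, 0.0]
```

For SVMs and support vector regression the path has the same piecewise-linear character, but the breakpoints arise from points entering and leaving the active set of a quadratic program rather than from a single threshold.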
Prof. Dr. Joachim Giesen
DFG - Deutsche Forschungsgemeinschaft
|Duration||April 2011 - April 2016|
Dipl.-Inf. Lars Kühne
|Link to website||http://theinf2.informatik.uni-jena.de/Projects.html|