Theses
Here, we provide potential topics for theses and a list of completed theses.
Topics for Theses
To write a thesis supervised by a member of our working group, you need knowledge corresponding to at least one of the courses Continuous Optimization or Operations Research. Below, you can find a list of potential thesis topics. You are also welcome to suggest alternative topics to us.
- Existence of solutions for bilevel optimization problems (topic for a BT): Optimization problems which are constrained by the parameter-dependent solution set of yet another optimization problem are referred to as bilevel optimization problems. In the thesis, this concept has to be introduced and motivated with the aid of suitable real-world applications. Then, the optimistic and pessimistic approaches to bilevel optimization have to be explained; a generic formulation of both approaches is sketched after this list. With the aid of suitable results from nonlinear parametric optimization, one has to work out conditions that ensure the existence of solutions for the optimistic and the pessimistic bilevel optimization problem.
- A penalty method for simple, convex bilevel optimization (topic for a BT): In simple, convex bilevel optimization, a convex function is minimized over the solution set of yet another convex optimization problem. In the thesis, this concept has to be introduced and motivated with the aid of suitable real-world applications. Then, a penalty method for the numerical solution of such optimization problems has to be studied; the underlying scheme is sketched after this list. The idea is to minimize the weighted sum of the two objective functions over the feasible set of the inner problem and to increase the weight associated with the inner objective function iteratively. First, the qualitative properties of this method have to be analyzed. Second, its quantitative numerical behavior has to be tested with the aid of suitable benchmark problems.
- Solution methods for simple, convex bilevel optimization (topic for a MT): In simple, convex bilevel optimization, a convex function is minimized over the solution set of yet another convex optimization problem. In the thesis, this concept has to be introduced and motivated with the aid of suitable real-world applications. Then, two methods for the numerical solution of such optimization problems have to be compared. First, a penalty approach has to be investigated. The idea is to minimize the weighted sum of the two objective functions over the feasible set of the inner problem and to increase the weight associated with the inner objective function iteratively. The subproblems can be solved with the aid of the proximal gradient method; a corresponding iteration is sketched after this list. A second approach combines the proximal point method for the outer minimization problem with the proximal gradient method for the inner problem. The qualitative properties of both methods have to be analyzed, and their quantitative numerical behavior has to be tested and compared with the aid of suitable benchmark problems.
- Optimization problems with nonsmooth norm regularization (topic for a BT or MT): One has to consider constrained optimization problems whose objective function is equipped with a 1- or \infty-norm regularization term. Such problems are nonsmooth but, at the price of additional slack variables, can be transformed into equivalent smooth constrained optimization problems. This has to be demonstrated in the thesis; the reformulation in the 1-norm case is sketched after this list. The relationship between the original problem and its reformulation has to be investigated in terms of global/local minimizers and the validity of constraint qualifications. The surrogate problems associated with suitable benchmark instances have to be solved with the aid of suitable software.
- Multiplier-penalty methods for optimization problems with switching constraints (topic for a MT): Constraints enforcing the product of two terms to be zero are referred to as switching constraints. The presence of switching constraints causes superordinate optimization problems to be rather irregular. To start the thesis, the problem class has to be motivated and its regularity behavior has to be illustrated. Afterwards, it has to be analyzed whether multiplier-penalty methods can be used to solve switching-constrained problems numerically. First, the classical multiplier-penalty method, where the switching constraints are interpreted as standard equality constraints, has to be investigated; the corresponding augmented Lagrangian is sketched after this list. Second, the switching constraints have to be augmented in such a way that the arising subproblems comprise switching constraints on decoupled slack variables. These subproblems can then be solved with a projected gradient method. Both approaches have to be compared within a set of numerical experiments.
- A comparison of relaxation and smoothing methods for optimization problems with switching constraints (topic for a MT): Constraints enforcing the product of two terms to be zero are referred to as switching constraints. The presence of switching constraints causes superordinate optimization problems to be rather irregular. To start the thesis, the problem class has to be motivated and its regularity behavior has to be illustrated. Afterwards, two solution approaches for switching-constrained optimization problems have to be studied; both are sketched after this list. The relaxation approach replaces each switching constraint by two inequality constraints which bound the product of the two terms by means of a relaxation parameter that is iteratively driven to zero. In contrast, smoothing methods replace each switching constraint by an equality constraint demanding that the product of the two terms equals the value of a smoothing parameter which is likewise iteratively driven to zero. Both approaches have to be compared with respect to their qualitative properties and their numerical behavior based on suitable benchmark problems.
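For the topic on the existence of solutions for bilevel optimization problems, the optimistic and pessimistic approaches can be illustrated by the following generic formulation; the notation (outer objective F, inner objective f, feasible sets X and Y(x), solution map \Psi) is chosen here purely for illustration.

```latex
% Parametric lower-level problem and its solution map
\[
  \Psi(x) := \operatorname*{argmin}_{y \in Y(x)} \, f(x,y)
\]
% Optimistic approach: the most favorable lower-level solution is taken into account
\[
  \min_{x \in X} \; \inf_{y \in \Psi(x)} F(x,y)
\]
% Pessimistic approach: one safeguards against the worst possible lower-level solution
\[
  \min_{x \in X} \; \sup_{y \in \Psi(x)} F(x,y)
\]
```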
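For the topic on a penalty method for simple, convex bilevel optimization, the scheme described above can be sketched as follows; the symbols (outer objective F, inner objective f, inner feasible set X, penalty parameters \gamma_k) are again only illustrative.

```latex
% Simple, convex bilevel problem: minimize F over the solution set of the inner problem
\[
  \min_{x} \; F(x) \quad \text{s.t.} \quad x \in \operatorname*{argmin}_{y \in X} \, f(y)
\]
% Penalty scheme: for an increasing sequence \gamma_k \to \infty, compute
\[
  x^k \in \operatorname*{argmin}_{x \in X} \; F(x) + \gamma_k \, f(x)
\]
% and investigate under which assumptions accumulation points of (x^k) solve the bilevel problem
```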
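For the topic comparing solution methods for simple, convex bilevel optimization, the penalized subproblems from the sketch above can, for instance, be tackled by the proximal gradient method. The splitting below, with f smooth, \delta_X the indicator function of X, F + \delta_X admitting a tractable proximal operator, and step sizes \alpha_j, is one possible choice and not prescribed by the topic.

```latex
% One proximal gradient iteration for the subproblem \min_{x \in X} F(x) + \gamma_k f(x)
\[
  x^{j+1} = \operatorname{prox}_{\alpha_j (F + \delta_X)}\!\bigl( x^j - \alpha_j \gamma_k \, \nabla f(x^j) \bigr)
\]
```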
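For the topic on nonsmooth norm regularization, the smooth reformulation can be sketched for the 1-norm as follows (smooth objective g, feasible set \Omega, regularization weight \mu, slack vector s; the \infty-norm case works analogously with a single scalar slack variable). The notation is only meant as an illustration.

```latex
% Nonsmooth problem with 1-norm regularization
\[
  \min_{x \in \Omega} \; g(x) + \mu \, \|x\|_1
\]
% Equivalent smooth reformulation with slack variables s \in \mathbb{R}^n
\[
  \min_{x,\,s} \; g(x) + \mu \sum_{i=1}^{n} s_i
  \quad \text{s.t.} \quad -s_i \le x_i \le s_i \;\; (i = 1,\dots,n), \quad x \in \Omega
\]
```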
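For the topic on multiplier-penalty methods for switching-constrained problems, the first approach mentioned there interprets the switching constraints as standard equality constraints and leads to the classical augmented Lagrangian sketched below; the notation (objective f, switching terms G_i and H_i, multipliers \lambda_i, penalty parameter \rho) is ours.

```latex
% Switching-constrained problem (all functions assumed to be smooth)
\[
  \min_{x} \; f(x) \quad \text{s.t.} \quad G_i(x)\, H_i(x) = 0, \quad i = 1,\dots,m
\]
% Classical augmented Lagrangian with the switching constraints treated as equality constraints,
% to be minimized approximately in x before the usual update \lambda_i \leftarrow \lambda_i + \rho \, G_i(x) H_i(x)
\[
  \mathcal{L}_\rho(x,\lambda)
  := f(x) + \sum_{i=1}^{m} \Bigl( \lambda_i \, G_i(x) H_i(x) + \tfrac{\rho}{2} \bigl( G_i(x) H_i(x) \bigr)^2 \Bigr)
\]
```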
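For the topic comparing relaxation and smoothing methods, the two surrogate problems for the switching-constrained problem stated above read as follows, with parameters t_k and \tau_k that are driven to zero; again, the notation is only illustrative.

```latex
% Relaxation: bound the product by a relaxation parameter t_k \downarrow 0
\[
  \min_{x} \; f(x) \quad \text{s.t.} \quad -t_k \le G_i(x)\, H_i(x) \le t_k, \quad i = 1,\dots,m
\]
% Smoothing: prescribe the product via a smoothing parameter \tau_k \downarrow 0
\[
  \min_{x} \; f(x) \quad \text{s.t.} \quad G_i(x)\, H_i(x) = \tau_k, \quad i = 1,\dots,m
\]
```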
Completed Theses (since 2024)
- The top spot on this list is still available!