Thesis and Project Topics

(scroll down for Specific topics)

We offer project and thesis topics (“Forschungspraktikum”, “Grosser Beleg”, “Bachelor, Master, and Diploma Thesis”). The topics cover a wide range of application fields and different methodologies – see our general research overview. Each researcher describes the research field they are interested in below – please contact them directly. It is often best to tailor the project to your background and interests. At the end of this page, you can also find specific announcements of projects (often in cooperation with other teams).

Bogdan Savchynskyy – Optimization and Machine Learning

My main interests are inference and learning (I&L) for graphical models, with a strong emphasis on optimization. In particular, I invite you to participate in the hot topics of my current research, which are:

  • Solving NP-hard I&L problems exactly and efficiently
  • GPU and CPU parallelization of I&L algorithms
  • Large-scale, distributed I&L

Please also see my lecture, which gives an introduction to this field.

Uwe Schmidt – BioImaging and Machine Learning

I am mainly interested in inference and learning for graphical models with continuous variables (e.g., for image intensities).
In particular, my focus is on supervised and unsupervised learning of these models in the context of low-level vision problems, such as image restoration. I recently started to work on biological data, as acquired by modern microscopy techniques.

Although I currently do not offer any specific projects or thesis topics, you can contact me if you are interested in one of these research topics.

Dmitrij Schlesinger – Machine Learning

Generally, I am interested in inference and especially learning in statistical structured models, such as Markov Random Fields (MRFs). Within this scope my interests are quite broad, ranging from applications (e.g., semantic segmentation, 3D reconstruction) to fundamental questions such as “Is maximum likelihood the right choice for structured models?”. Last but not least, I am interested in efficient algorithms for both inference and learning. Below are topics that (I hope) are suitable for student projects and theses:

  • An experimental comparison of maximum a-posteriori decisions and maximum marginal decisions in MRFs (see the sketch below)
  • Combining Convolutional Neural Networks with Conditional Random Fields (together with Michael Yang)
  • Combining generative and discriminative learning
  • Parallelized models and algorithms for efficient learning on big data (together with Bogdan Savchynskyy)

Of course, you may also suggest your own topic. I would be glad to supervise it if it fits well into our research landscape. The topics above are especially suitable for people who have attended or plan to attend Machine Learning I or, even better, Machine Learning II.
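
For the first topic, here is a rough sketch of the two decision rules being compared, written in standard notation as an illustration only (the concrete experimental setup is deliberately left open). For an MRF over a labeling x = (x_1, …, x_n) with observation y, the maximum a-posteriori decision picks the jointly most probable labeling, while the maximum marginal decision picks, for each variable separately, the label with the highest marginal probability:

    \hat{x}^{\mathrm{MAP}} = \arg\max_{x} p(x \mid y), \qquad
    \hat{x}^{\mathrm{MM}}_{i} = \arg\max_{x_i} p(x_i \mid y) \quad \text{for each } i

The two rules generally disagree: the MAP decision minimizes the expected 0/1 loss, whereas the max-marginal decision minimizes the expected Hamming loss, which is exactly what makes their experimental comparison interesting.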

Florian Jug (with Gene Myers team) – Tracking and Segmentation in 2D and 3D Microscopy

The ultimate goal is to develop an accurate, flexible, and interactive tracking tool that can be applied to a large variety of biological data. Building such a system poses many challenging tasks in the fields of machine learning and discrete optimization, such as accurate segmentation, fast inference in graphical models, life-long learning, and interactive visualization.

Does this sound interesting? Just get in touch with me – I am sure we can discuss your interests and find a matching project. Such a project could range from online microscopy control to theoretical contributions regarding inference and learning with graphical models, or even human-machine interfaces and visualization in the context of tracking and proofreading. I am looking forward to hearing from you!

 

Specific topics
 
Unsupervised Conditional Random Forests (please contact: Alexander Krull) – Thesis
Random forests are an important tool for many applications such as segmentation or (human) pose estimation. A conditional random forest contains trees which are specialized for some variable, such as the light condition or the body shape. We would like to explore whether conditional random forests can be trained without knowledge of this variable and still give a boost in performance.
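
To make the setting concrete, below is a minimal sketch of a conditional random forest for the easier case where the conditioning variable z is discrete and known at training time. The class name ConditionalRandomForest, its parameters, and the use of scikit-learn's RandomForestClassifier are illustrative assumptions, not the prescribed approach for the thesis.

    # Minimal sketch of a conditional random forest, assuming the conditioning
    # variable z (e.g. light condition or a body-shape cluster) is discrete and
    # observed at training time. All names here are illustrative assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    class ConditionalRandomForest:
        """One random forest specialized per value of a conditioning variable z."""

        def __init__(self, n_estimators=50):
            self.n_estimators = n_estimators
            self.forests = {}

        def fit(self, X, y, z):
            # Train a separate, specialized forest on each subset of the data
            # that shares the same value of the conditioning variable.
            for value in np.unique(z):
                mask = (z == value)
                forest = RandomForestClassifier(n_estimators=self.n_estimators)
                forest.fit(X[mask], y[mask])
                self.forests[value] = forest
            return self

        def predict_proba(self, X, z_belief):
            # Mix the per-condition forests according to a belief over z
            # (a dict mapping each value of z to a weight summing to 1).
            # For simplicity this sketch assumes every class label occurs
            # under every value of z, so the probability matrices align.
            proba = None
            for value, forest in self.forests.items():
                weighted = z_belief.get(value, 0.0) * forest.predict_proba(X)
                proba = weighted if proba is None else proba + weighted
            return proba

With a hard, one-hot belief over z, predict_proba simply evaluates the matching specialized forest. The thesis question is whether this specialization into per-condition forests, and the resulting performance gain, can still be obtained when z is never observed.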