The course consists of lectures and practical exercises
and concentrates on optimization techniques for machine learning tasks,
such as inference and learning for graphical models. The course
consists of three parts: first, we briefly review basic convex optimization
techniques widely used in machine learning. The second part is
devoted to the inference problem for graphical models: we consider a number
of existing algorithms, show their connections, and analyze them from
an optimization point of view. In the final, third part we consider the
parameter learning of graphical models, both probabilistic and
discriminative.
- 12.06: In place of the exercises on 15.06, there will be a "Questions & Answers" session for our course. Please prepare your questions (of any type, on the theory or the exercises); it is a good opportunity to get them answered.
- 9.06: On 10.06 the lecture will be given by D. Schlesinger.
- 5.06: There will be NO exercises on Monday, 8.06.
- 4.05: There will be no lecture on Wednesday, May 6, because of Dies academicus.
- 24.04: Exercise sheet 1 has been extended to cover the class on 27.04.2015. A table to enter your results has been added. The installation script has been updated as well.
- The software installation readme and the PDF script of the lecture have been added to the page. Several typos in the slides and script from Lecture 1 have been fixed.
- Lecture time changed: lectures will be held on Wednesdays at 11:10 instead of 7:30!
Lectures: Wednesday, 11:10 - 12:40, Room E001, Start: April 15, 2015
Exercises: Monday 9:20-10:50, Room E069, Start: April 20
Prerequisites: Machine Learning 1, good knowledge of math (linear algebra, optimization), programming (python/C++)
Credits: 2/2/0, oral exam. Enrolment: jExam. Attendees: max. 60
Note: Lectures are held in English.
Script: PDF-Download. Script and slides are protected with a password, which reads E001.
Software installation readme: Download
| # | Lecture | Exercise |
|---|---------|----------|
| 1 | Introduction, motivation, relation to other courses. Graphical models. [slides] | none |
| 2 | Inference in graphical models as an integer linear program. Linear programming relaxation. [script, Sec. 1] | (OpenGM) Introduction to the OpenGM inference library, CPLEX as solver: solver parameters, scalability. [exercise][supplement][Table-for-results] |
| 3 | Convex optimization. Lagrange duality, complementary slackness. [script, Sec. 1-2; slides; book, Sec. 5] | |
| 4 | Dies academicus | (Blackboard) Lagrange duality for linear programs. [exercise][Table-for-results] |
| 5 | Dual of the LP relaxation. Reparametrization. [slides] | (Blackboard) Lagrange duality for quadratic programs. [exercise][Table-for-results] |
| 6 | Tree-structured graphical models. Dynamic programming. [slides] | (OpenGM) Dual of the LP relaxation. Reparametrization. [exercise][supplement][Table-for-results] |
| 7 | Optimality conditions for the dual MAP LP. [slides] | (Blackboard) Exercises on dynamic programming. [exercise] |
| 8 | Relaxation labeling and diffusion algorithms. [slides] | none |
| 9 | Convex optimization. First-order smooth and non-smooth optimization (gradient, coordinate descent, sub-gradient method) and their application to inference. [script, Sec. 4][slides] | Questions and Answers session |
| 10 | Lagrangian (dual) decomposition. Lagrange decomposition for inference with graphical models. [slides] | (OpenGM) Coordinate descent algorithms. [exercise][Table-for-results] |
| 11 | Sub-modularity. Energy minimization as a min-cut problem. [script, Sec. 6] | (OpenGM) Tree decomposition - use of different algorithms. [exercise][supplement][Table-for-results] |
| 12 | Graph cut based algorithms. [slides] | (Whiteboard) Dynamic programming, arc consistency, tree agreement - exercises. |
| 13 | Binary LP relaxation as min-cut. Partial optimality. [script, Sec. 7] | (OpenGM) Graph cut based inference. [exercise][Table-for-results] |
| 14 | Outlook. What we have learned and advanced topics. [slides] | (Whiteboard) Binary LP relaxation as min-cut. Partial optimality. |
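As a small taste of the dynamic-programming inference from lecture 6, here is a minimal illustrative sketch (not part of the course materials; the function name and toy costs are made up for this example) of min-sum dynamic programming for MAP inference on a chain-structured graphical model:

```python
# Illustrative sketch: MAP inference on a chain by min-sum dynamic programming.
# All names and toy cost values are hypothetical, not taken from the course code.
import numpy as np

def chain_map(unary, pairwise):
    """MAP labeling of a chain-structured model.

    unary:    list of n arrays, unary[i][x] = cost of label x at node i
    pairwise: list of n-1 matrices, pairwise[i][x, y] = cost of
              (label x at node i, label y at node i+1)
    Returns (optimal labeling, its energy).
    """
    n = len(unary)
    msgs = [None] * n   # msgs[i][y]: best cost of nodes 0..i with label y at node i
    back = [None] * n   # argmin pointers for backtracking
    msgs[0] = np.asarray(unary[0], dtype=float)
    for i in range(1, n):
        # total[x, y] = best prefix ending in x at i-1, plus edge and unary costs
        total = msgs[i - 1][:, None] + pairwise[i - 1] + np.asarray(unary[i])[None, :]
        back[i] = total.argmin(axis=0)
        msgs[i] = total.min(axis=0)
    # backtrack from the best final label
    labels = [int(msgs[-1].argmin())]
    for i in range(n - 1, 0, -1):
        labels.append(int(back[i][labels[-1]]))
    labels.reverse()
    return labels, float(msgs[-1].min())

# Toy 3-node chain with 2 labels and a Potts-style pairwise penalty.
unary = [np.array([0.0, 1.0]), np.array([0.5, 0.0]), np.array([1.0, 0.0])]
pw = [np.array([[0.0, 1.0], [1.0, 0.0]])] * 2
labels, energy = chain_map(unary, pw)   # optimal labeling and its energy
```

The same forward-pass/backtracking pattern extends from chains to arbitrary trees by processing nodes in leaf-to-root order, which is the setting discussed in the lecture.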
See also the teaching web page of the Computer Vision Lab Dresden.