Machine Learning 1 WS 16/17




Dmitrij Schlesinger, Winter semester 2016/2017

A common opinion is that the main task of Machine Learning is to establish a connection between raw data and its semantically meaningful interpretation. Put more poetically: "to teach computers to think" (or at least to understand data). Machine Learning approaches have numerous applications in many different subject areas, especially those where data uncertainty plays a crucial role, e.g. Natural Language Processing and Computer Vision. The lecture is based on the book "Pattern Recognition and Machine Learning" by Christopher Bishop (Springer, 2006), but sets its own accents.

Modules:
INF-04-FG-IS INF-BAS2 INF-BAS7

Lectures: Monday, 2DS, 09:20-10:50, INF E023, Start: 10.10.2016

Exercises: Tuesday, 3DS, 11:10-12:40, INF E008 (start: 18.10) and Friday, 4DS, 13:00-14:30, INF E009 (start: 14.10)

Prerequisites: Solid mathematical background

Note: The lectures will be held in English

Extent: 2/2/0, Exam: Oral exam after the semester

Enrollment: via jExam; if that does not work, by email. Maximum attendees: 60


News:

01.02.2017: No exercise on 3.02.

20.01.2017: Information about the oral exam. I am not in Dresden from 13.02 to 17.02. At other times (in February/March) I am quite flexible. So if you would like to make an appointment for e.g. a "module exam" (which usually covers 8 SWS, i.e. two lectures), please contact the other lecturer first and just let me know; I will then confirm (or not). For other types of exams (e.g. ML1 only), please contact me by e-mail.

31.12.2016: The first lecture after the Christmas break will be on 09.01.2017, and the first exercise on 06.01.2017.

20.12.2016: Since the Christmas break already starts on 22.12, there will be no exercise on 23.12.

15.11.2016: No exercises on 18.11, 22.11, 25.11, no lecture on 21.11.

5.10.2016:
Welcome! On this page, all required information will be provided during the semester: scripts, exercise assignments, literature, current announcements, dates, and so on. For now, the scripts below are taken from the previous year. They will be updated during the semester; the most recently updated lecture is marked in bold. Further details will be discussed at the first lecture on Monday, 10.10.


Scripts:
Lectures:
10.10: Introduction, Probability Theory
17.10: Bayesian Decision Theory
24.10: Maximum Likelihood Principle
31.10: Cancelled, Reformationstag
07.11: Discriminative Learning
14.11: Neuron, Linear Classifiers
28.11: Exponential Family
05.12: Support Vector Machines
12.12: Empirical Risk Minimization
19.12: Kernel-PCA, AdaBoost, applet, example
09.01: Feed-Forward Neural Networks
16.01: Hopfield Networks
23.01: Clustering
30.01: Structural Models
xx.xx: Decision Forests
Exercise assignments:
14.10, 18.10: Probability Theory (en), (de)
21.10, 25.10: Bayesian Decision Theory (en), (de)
28.10, 01.11: Maximum Likelihood Principle (en), (de) + leftovers from the previous sheet
04.11, 08.11: leftovers from the previous sheet
11.11, 15.11: Discriminative Learning (en), (de)
xx.xx, 29.11: We will finish the previous exercise sheet.
02.12, 06.12: Neuron, Linear Classifiers (en), (de)
09.12, 13.12: Exponential Family (en), (de) + SVM (en), (de)
16.12, 20.12: Leftovers + Empirical Risk Minimization (en), (de)
06.01, 10.01: Kernel-PCA (en), (de), AdaBoost (en), (de)
13.01, 17.01: Feed-Forward Neural Networks (en), (de)
20.01, 24.01: Hopfield Networks (en), (de)
27.01, 31.01: Clustering (en), (de)

Last updated: 01.02.2017