Machine Learning for Computer Vision (IN2357) (2h + 2h, 5 ECTS)
SS 2018, TU München
Announcements
You can use our library for the programming exercises: mlcv-tutorial
April 17th: The rooms for both lecture and tutorial have changed. See below.
April 20th: New frequently asked questions section. See below.
June 25th: There will be no tutorial on Thursday, June 28.
July 15th: There will be no repeat exam in SS2018.
July 25th: No cheat sheets, calculators, or other aids are allowed in the exam.
FAQ
1. Attendance at the lecture is open to all.
2. If you are pursuing a degree that is not in Computer Science and you want to take the exam, ask the administrative staff responsible for your degree whether that is possible (it most probably is).
3. If you are an LMU student and you want to take the exam, ask the administrative staff responsible for your degree whether that is possible (it most probably is).
4. There is no way to earn extra points toward your final grade (e.g., through bonus exercises).
Lecture
Location: MW 0350 (Egbert von Hoyer)
Date: Fridays, starting from April 13th
Time: 14.00 - 16.00
Lecturer: PD Dr. habil. Rudolph Triebel
SWS: 2
Tutorial
Location: Interimshörsaal 2
Date: Thursdays, starting from April 19th
Time: 16.00 - 18.00
Lecturer: John Chiotellis, Maximilian Denninger
SWS: 2
Office Hours
Location: 02.09.058
Date: Wednesdays
Time: 14.00 - 15.00
Contents
In this lecture, students are introduced to the machine learning methods most frequently used in computer vision and robotics applications. The major aim of the lecture is for students to gain a broad overview of existing methods and to understand their motivations and main ideas in the context of computer vision and pattern recognition.
Note that the lecture has a new module number: in earlier semesters it was IN3200, now it is IN2357. The content, however, is (almost) the same. For material from previous semesters, please refer to, e.g.: WS2017
Tentative Schedule
| Topic | Lecture Date | Tutorial Date |
| --- | --- | --- |
| Introduction / Probabilistic Reasoning | 13.04 | 19.04 |
| Regression | 20.04 | 26.04 |
| Graphical Models I | 27.04 | 03.05 |
| Graphical Models II | 04.05 | 10.05 |
| Bagging and Boosting | 11.05 | 17.05 |
| Metric Learning | 18.05 | 24.05 |
| Deep Learning | 25.05 | ? |
| Sequential Data / Hidden Markov Models | 01.06 | 07.06 |
| Kernels and Gaussian Processes | 08.06 | 14.06 |
| Clustering I | 15.06 | 21.06 |
| Clustering II | 22.06 | - |
| Variational Inference I | 29.06 | 05.07 |
| Variational Inference II | 06.07 | 12.07 |
| Sampling Methods | 13.07 | ? |
Prerequisites
Linear Algebra, Calculus, and Probability Theory are essential building blocks for this course. The homework exercises do not have to be handed in. Solutions for the programming exercises will be provided in Python.
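To give a feel for the style of Python used in the exercises, here is a minimal sketch (not part of the official exercise material) of closed-form least-squares linear regression with NumPy, matching the Regression topic in the schedule. The synthetic data and all variable names are illustrative assumptions.

```python
import numpy as np

# Synthetic 1-D regression problem (illustrative data, not from the course).
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=50)
y = 2.0 * x + 0.5 + rng.normal(scale=0.1, size=50)  # true model: y = 2x + 0.5 + noise

# Design matrix with a bias column, so the model reads y ~ Phi @ w.
Phi = np.column_stack([x, np.ones_like(x)])

# Least-squares solution of min_w ||Phi w - y||^2, computed stably via
# np.linalg.lstsq rather than forming the normal-equation inverse explicitly.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(f"estimated slope = {w[0]:.3f}, intercept = {w[1]:.3f}")
```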
Lecture Slides
1. Introduction and Probabilistic Reasoning
2. Regression
3. Graphical Models I
4. Graphical Models II
5. Boosting and Bagging
6. Metric Learning
7. Deep Learning
Notebooks
8. HMMs
9. Kernel Methods and Gaussian Processes
10. Clustering I: K-means and EM for GMMs
11. Clustering II: Dirichlet Process and Spectral Clustering
12. Variational Inference I: Mean Field
13. Variational Inference II: Expectation Propagation / Sampling I
14. Sampling II