====== Probabilistic Graphical Models in Computer Vision (IN2329) (2h + 2h, 5 ECTS) ======
<html>
<b> Announcement: \\
The schedule of the repeat exam is announced. See below for details. \\
There was an error in exercise sheet 9, which is corrected now (2019.07.09 16:00). \\
The last lecture on 22.07 will be on deep Boltzmann machines presented by [[: \\
There will be NO tutorial on Wednesday, 12.06.2019. Sheet 5 should be submitted on 17.06. \\
There will be NO lecture on Wednesday, 24.04.2019. \\
\\
</b>
</html>
Several problems in computer vision can be cast as a labeling problem. Typically, such problems arise from Markov Random Field (MRF) models, i.e. undirected graphical models, which provide an elegant framework for formulating various types of labeling problems in vision. Under certain assumptions some "nice" MRF models can be solved in polynomial time, whereas MAP inference in general MRFs is NP-hard.
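As a small illustration (not part of the course material), the energy of a pairwise MRF for binary image labeling can be written down in a few lines. The quadratic unary term, Potts pairwise term, weights, and toy image below are all made-up assumptions for the sketch:

```python
import numpy as np

def mrf_energy(labels, observed, unary_w=1.0, pairwise_w=0.5):
    """Pairwise MRF energy for binary labels on a 4-connected grid.

    E(x) = unary_w * sum_i (x_i - y_i)^2
         + pairwise_w * sum over neighbor pairs (i,j) of [x_i != x_j]
    """
    # Unary term: data fidelity to the observed image y.
    unary = unary_w * np.sum((labels - observed) ** 2)
    # Pairwise Potts term: count horizontal and vertical neighbor disagreements.
    pairwise = pairwise_w * (
        np.sum(labels[:, 1:] != labels[:, :-1])
        + np.sum(labels[1:, :] != labels[:-1, :])
    )
    return unary + pairwise

observed = np.array([[0, 0, 1],
                     [0, 1, 1]])
# Labeling identical to the observation: zero unary cost, smoothness cost only.
print(mrf_energy(observed, observed))  # → 1.5
```

Minimizing such an energy over all labelings is exactly the labeling problem discussed above; for binary labels with this kind of pairwise term it is among the "nice" cases solvable in polynomial time (e.g. via graph cuts).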
  - Approximate inference techniques:
    * Loopy belief propagation
    * Mean field, variational inference
    * Sampling methods
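To give a flavor of one of these techniques, here is a minimal sketch (an illustration, not course code) of naive mean-field inference for an Ising model on a grid; the grid size, field strength, and coupling are arbitrary assumptions:

```python
import numpy as np

def mean_field_ising(b, J, n_iters=50):
    """Naive mean-field inference for an Ising model on a 4-connected grid.

    Model: p(x) proportional to exp( sum_i b_i x_i + J * sum over neighbor
    pairs of x_i x_j ), with x_i in {-1, +1}.
    Fixed-point updates: m_i <- tanh(b_i + J * sum of neighbor means m_j).
    Returns the approximate marginal means m_i = E_q[x_i].
    """
    m = np.zeros_like(b, dtype=float)
    for _ in range(n_iters):
        # Sum of neighbor means on the 4-connected grid (borders padded with 0).
        nb = np.zeros_like(m)
        nb[1:, :] += m[:-1, :]
        nb[:-1, :] += m[1:, :]
        nb[:, 1:] += m[:, :-1]
        nb[:, :-1] += m[:, 1:]
        m = np.tanh(b + J * nb)
    return m

# A weak positive field plus a smoothing coupling pulls all means toward +1.
b = 0.1 * np.ones((4, 4))
m = mean_field_ising(b, J=0.5)
```

Note that these fully parallel updates are not guaranteed to converge in general; sequential (coordinate-wise) updates are the safer variant, and the choice of update schedule is one of the practical issues the lecture covers in more depth.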
Date: August 05th, 08:30 - 09:45. \\
Place: 102, Interims Hörsaal 2 (5620.01.102) \\
The final exam will be written. No cheat sheet is allowed.
+ | |||
+ | == Repeat Exam == | ||
+ | Date: October 08th, 10:30 - 11:45.\\ | ||
+ | Place: 102, Interims Hörsaal 2 (5620.01.102) \\ | ||
+ | The repeat exam will be written. No cheat sheet is allowed. | ||
== Lecture Materials ==
Course material (slides and exercise sheets) can be accessed [[teaching: