Seminar: Selected Topics in Deep Learning -- Equivariance & Dynamics (5 ECTS)
Winter Semester 2025
Organizer: Karnik Ram (karnik.ram@tum.de)
Preliminary meeting: 11.07.25, 14:00 - 14:30 CET. Slides
Description
The current trend in deep learning is towards scaling model size and data, and more recently test-time compute. A smaller but steady trend has been the introduction of physics-inspired inductive biases, such as symmetries and dynamics, into these models. Apart from being theoretically interesting, these methods also offer the promise of models that are data-efficient and interpretable, which is especially important for many scientific problems. In this seminar we will look at important papers in this direction, focusing specifically on equivariant and dynamical models.
Format
- Students are expected to study one paper in depth, present it, and lead a discussion on it. In addition to the presentation and a written report, students are expected to periodically submit one-paragraph summaries of the papers discussed and to participate in the discussions.
- Sessions will be held in person (with a remote attendance option) once every two weeks, on Tuesday afternoons (14:30 - 16:30), with two paper presentations per session. Depending on interest, a catch-up lecture on relevant deep learning topics (e.g., diffusion models, graph learning) will be held at the start of the seminar.
- All class-related communication takes place on Discord; the summaries, presentation, and report are submitted via Gradescope.
Prerequisites
A good understanding of machine learning techniques (especially deep learning), linear algebra, and calculus. Undergraduate students who are interested in enrolling should contact the organizer directly.
Schedule
Location: MI 01.13.010 & Online
Time: 14:30 - 16:30
| Date | Paper | Presenter | Slides |
|---|---|---|---|
| 14.10 | Introduction & Review (Online) | Karnik | |
| 11.11 | Neural Ordinary Differential Equations | Aniket | |
| | Action Matching: Learning Stochastic Dynamics from Samples | Billy | |
| 25.11 | Doob's Lagrangian: A Sample-Efficient Variational Approach to Transition Path Sampling | Sergine | |
| | Consistent Sampling and Simulation: Molecular Dynamics with Energy-Based Diffusion Models | Hannes | |
| 09.12 | Artificial Kuramoto Oscillatory Neurons | Dima | |
| | Space-Time Continuous PDE Forecasting using Equivariant Neural Fields | Ale | |
| 20.01 | 3D Steerable CNNs: Learning Rotationally Equivariant Features in Volumetric Data | Perasolo | |
| | SE(3)-Stochastic Flow Matching for Protein Backbone Generation | Maximilian | |
| 27.01 | Probing Equivariance and Symmetry Breaking in Convolutional Networks | Mykhailo | |
| | Neural Isometries: Taming Transformations for Equivariant ML | Tingwei | |
| 03.02 | Flow Equivariant Recurrent Neural Networks | Sander | |
| | Equivariant Adaptation of Large Pretrained Models | Giacomo | |