This page has moved

Please visit the current reading group webpage at https://bamler-lab.github.io/reading_group.

The list below is no longer being updated.

Reading Group @BamlerLab

MvL6: 4th floor seminar room (Thursday 1:30 pm ($\color{olive}{\textsf{General}}$) and Monday 3:00 pm ($\color{orange}{\textsf{Compression}}$))

The BamlerLab reading group ($\color{olive}{\textsf{General}}$) meets weekly on Thursdays to engage in a comprehensive exploration and interpretation of scholarly works. The material to be examined can be accessed at the following link: https://github.com/bamler-lab/reading-group. Your participation in the group is cordially invited, and should you choose to attend, please feel free to join in the discussion.

Additionally, we're thrilled to introduce a new reading group exclusively centered on compression topics ($\color{orange}{\textsf{Compression}}$). This exciting venture convenes every Monday. It promises to be an engaging forum for those passionate about compression algorithms and techniques.

We thank ChatGPT for writing this description.

2024

| Date | Moderator | Title of Paper & Link to Paper | talks.tue | Comment | Reading Group |
| --- | --- | --- | --- | --- | --- |
| 2024/09/12 | all participants | Experiences and Trends at ICML 2024 | N/A | | $\color{olive}{\textsf{General}}$ |
| 2024/08/08 | Robert Bamler | Discrete Diffusion Modeling by Estimating the Ratios of the Data Distribution | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2024/07/29 | all participants | Neural Discrete Representation Learning | talks.tue | | $\color{orange}{\textsf{Compression}}$ |
| 2024/07/15 | all participants | The No Free Lunch Theorem, Kolmogorov Complexity, and the Role of Inductive Biases in Machine Learning | talks.tue | | $\color{orange}{\textsf{Compression}}$ |
| 2024/07/11 | Tristan Cinquin | Randomized Algorithms for Matrix Computations (Chapters 3-5) | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2024/07/04 | Robert Bamler | Randomized Algorithms for Matrix Computations (Chapters 1-3) | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2024/06/24 | all participants | Distribution Compression in Near-linear Time | talks.tue | | $\color{orange}{\textsf{Compression}}$ |
| 2024/06/17 | all participants | Estimating the Rate-Distortion Function by Wasserstein Gradient Descent | talks.tue | | $\color{orange}{\textsf{Compression}}$ |
| 2024/05/13 | all participants | Lossy Image Compression with Conditional Diffusion Models | talks.tue | | $\color{orange}{\textsf{Compression}}$ |
| 2024/05/06 | all participants | Out-of-Distribution Detection using Maximum Entropy Coding | talks.tue | | $\color{orange}{\textsf{Compression}}$ |
| 2024/04/29 | all participants | On universal quantization | talks.tue | | $\color{orange}{\textsf{Compression}}$ |
| 2024/04/22 | all participants | Universal Deep Neural Network Compression | talks.tue | | $\color{orange}{\textsf{Compression}}$ |
| 2024/04/11 | Alexander Conzelmann | Intrinsic Dimensionality Explains the Effectiveness of Language Model Fine-Tuning | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2024/04/08 | all participants | Lossy Compression with Gaussian Diffusion | talks.tue | Part 2/2 | $\color{orange}{\textsf{Compression}}$ |
| 2024/04/04 | Tim Xiao | LoRA: Low-Rank Adaptation of Large Language Models | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2024/03/21 | Tristan Cinquin | Bayesian Model Selection, the Marginal Likelihood, and Generalization | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2024/03/18 | all participants | Lossy Compression with Gaussian Diffusion | talks.tue | Part 1/2 | $\color{orange}{\textsf{Compression}}$ |
| 2024/03/14 | Johannes Zenn | Diffusion Schrödinger Bridge Matching | talks.tue | Part 2/2 | $\color{olive}{\textsf{General}}$ |
| 2024/03/11 | all participants | Language Modeling is Compression | talks.tue | | $\color{orange}{\textsf{Compression}}$ |
| 2024/03/07 | Johannes Zenn | Diffusion Schrödinger Bridge Matching | talks.tue | Part 1/2 | $\color{olive}{\textsf{General}}$ |
| 2024/03/04 | all participants | Bit-Swap: Recursive Bits-Back Coding for Lossless Compression with Hierarchical Latent Variables | talks.tue | | $\color{orange}{\textsf{Compression}}$ |
| 2024/02/29 | Alexander Conzelmann | Learning Generative Models with Sinkhorn Divergences | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2024/02/26 | all participants | Wasserstein Distortion: Unifying Fidelity and Realism | N/A | | $\color{orange}{\textsf{Compression}}$ |
| 2024/02/19 | all participants | Compressing Images by Encoding Their Latent Representations with Relative Entropy Coding | N/A | | $\color{orange}{\textsf{Compression}}$ |
| 2024/02/15 | Robert Bamler | Introduction to Optimal Transport | N/A | | $\color{olive}{\textsf{General}}$ |
| 2024/02/08 | Alexander Conzelmann | Low-Precision Stochastic Gradient Langevin Dynamics | N/A | | $\color{olive}{\textsf{General}}$ |
| 2024/01/11 | Tim Xiao & Johannes Zenn | Experiences and Trends at NeurIPS 2023 | N/A | | $\color{olive}{\textsf{General}}$ |

2023

| Date | Moderator | Title of Paper & Link to Paper | talks.tue | Comment | Reading Group |
| --- | --- | --- | --- | --- | --- |
| 2023/12/07 | Robert Bamler | Show and Tell Session | talks.tue | Show and Tell Session | $\color{olive}{\textsf{General}}$ |
| 2023/11/30 | Lenard Rommel | Finite Volume Neural Networks for Simple Vortex Problems | talks.tue | Bachelor's Thesis Presentation | $\color{olive}{\textsf{General}}$ |
| 2023/11/23 | Johannes Zenn | More Faithful Variational Inference via the Initial Distribution of Differentiable Annealed Importance Sampling | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2023/10/26 | Tristan Cinquin | Regularized KL-Divergence for Well-Defined Function Space Variational Inference in BNNs | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2023/09/07 | Alexander Ludwig | Neural Data Compression for Magnetic Resonance Imaging | talks.tue | Bachelor's Thesis Presentation | $\color{olive}{\textsf{General}}$ |
| 2023/08/31 | Nicolò Zottino | Probabilistic Circuits | talks.tue | Talk | $\color{olive}{\textsf{General}}$ |
| 2023/08/17 | Robert Bamler | Algorithms for the Communication of Samples | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2023/08/10 | Robert Bamler | Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2023/07/20 | Tim Xiao | DreamFusion: Text-to-3D using 2D Diffusion | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2023/07/06 | Alexander Conzelmann | From data to functa: Your data point is a function and you can treat it like one | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2023/05/25 | Johannes Zenn | Diffusion Probabilistic Fields | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2023/04/27 | Tim Xiao<br>Johannes Zenn | Trading Information between Latents in Hierarchical Variational Autoencoders<br>Resampling Gradients Vanish in Differentiable Sequential Monte Carlo Samplers | talks.tue<br>talks.tue | Poster Presentation | $\color{olive}{\textsf{General}}$ |
| 2023/04/12 | Robert Bamler | Finite Volume Neural Network: Modeling Subsurface Contaminant Transport | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2023/03/30 | Nicolò Zottino | Peer-to-Peer Variational Federated Learning Over Arbitrary Graphs | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2023/03/16 | Johannes Zenn | Langevin Diffusion Variational Inference | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2023/03/02 | Tim Xiao | Git Re-Basin: Merging Models modulo Permutation Symmetries | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2023/02/23 | Tristan Cinquin | Understanding Variational Inference in Function-Space | talks.tue | | $\color{olive}{\textsf{General}}$ |
| 2023/02/09 | Alexander Conzelmann | Diffusion Probabilistic Modeling for Video Generation | N/A | | $\color{olive}{\textsf{General}}$ |
