The aim of this Ph.D. is to investigate theoretical and practical relationships between various approaches from the deep generative modelling literature, such as the (D)VAE family [1], diffusion models [2], and normalising flows [3].
The candidate will be supervised by Xavier Alameda-Pineda and Pedro L. C. Rodrigues and work at Inria, Grenoble.
For more information, please contact us by e-mail at pedro.rodrigues@inria.fr and xavier.alameda-pineda@inria.fr with:
-- Your CV
-- Your transcripts from your master's studies
-- The names of one or two people who could provide a recommendation letter for you
We expect that developing a unified framework encompassing these different models may provide insights into how they are trained, potentially reducing the number of samples required for training. We are particularly interested in evaluating how this framework might help with unsupervised domain adaptation of probabilistic models, i.e. situations where the statistical properties of the training set and the testing set differ.
To that end, one possibility is to investigate the formalism of **information geometry** [4], which represents probability density functions (pdfs) as points on a Riemannian manifold and provides a theoretical and practical framework for considering geodesic paths and distances between them, barycenters of a set of pdfs, etc. The compatibility of this framework with deep probabilistic models is yet to be investigated, and algorithms for effectively computing such distances are yet to be developed. An initial investigation in the recent literature is Arvanitidis et al. [5].
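To make the geometric picture concrete, consider the simplest non-trivial case: the family of univariate Gaussians equipped with the Fisher information metric, for which the geodesic (Fisher-Rao) distance has a known closed form. The NumPy sketch below (the function name and numerical examples are ours, purely for illustration) computes this distance; extending such computations to deep generative models is precisely the kind of open problem the thesis would address.

```python
import numpy as np

def fisher_rao_gaussian(mu1, sigma1, mu2, sigma2):
    """Closed-form Fisher-Rao distance between the univariate Gaussians
    N(mu1, sigma1^2) and N(mu2, sigma2^2).

    Under the Fisher information metric ds^2 = (dmu^2 + 2 dsigma^2) / sigma^2,
    the Gaussian family is (up to a factor sqrt(2)) the hyperbolic upper
    half-plane, which yields the arccosh expression below.
    """
    num = (mu1 - mu2) ** 2 + 2.0 * (sigma1 - sigma2) ** 2
    return np.sqrt(2.0) * np.arccosh(1.0 + num / (4.0 * sigma1 * sigma2))

# The metric is scale-aware: the same shift in mean counts as "further"
# between two sharp distributions than between two broad ones.
print(fisher_rao_gaussian(0.0, 1.0, 1.0, 1.0))  # ~0.98
print(fisher_rao_gaussian(0.0, 0.1, 1.0, 0.1))  # ~5.59
```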
Although this is a rather methodological Ph.D. subject, we intend to validate our findings on real-world applications in which the supervisors are well versed. These include speech enhancement, multi-person tracking, and brain-computer interfaces.
[1] Girin et al. "Dynamical variational autoencoders: A comprehensive review". Foundations and Trends in Machine Learning, 2021.
[2] Song et al. "Score-Based Generative Modeling through Stochastic Differential Equations". ICLR, 2021.
[3] Papamakarios et al. "Normalizing Flows for Probabilistic Modeling and Inference". Journal of Machine Learning Research, 2021.
[4] Nielsen. "An Elementary Introduction to Information Geometry". Entropy, 2020.
[5] Arvanitidis et al. "Pulling back information geometry". AISTATS, 2022.