Covariance matrix advances for machine learning
We remind you that, in order to guarantee access to the meeting rooms for all registrants, registration for meetings is free but mandatory.
Registration for this meeting is closed.
Registrations
44 GdR ISIS members and 37 non-members are registered for this meeting.
Room capacity: 95 people.
Announcement
The estimation of covariance or correlation matrices is an old problem, simply posed but fundamental in many scientific fields. Research on this topic is still very active and continues to progress, in particular by addressing covariance estimation in the regime of asymptotically large (or, conversely, very scarce) data.
This GdR day aims to share these new advances, and more particularly those arising from two specific domains: random matrix theory and differential geometry. In particular, the presentations will focus on the online estimation of statistical parameters of non-Gaussian distributions, as well as on recent breakthroughs in random matrix theory for theoretical and applied statistical machine learning.
Organizers
- Romain Couillet (GIPSA-Lab, Grenoble, France), romain.couillet@gipsa-lab.grenoble-inp.fr
- Guillaume Ginolhac (University Savoie Mont-Blanc, France), guillaume.ginolhac@univ-smb.fr
Programme
The program of the day includes two long talks (1h) and two short talks (30 min), starting at 1:30pm with an expected end at 5:15pm:
- 1:30pm - 1:45pm: Introduction
- 1:45pm - 2:45pm: Zhou Fan, Yale University, http://www.stat.yale.edu/~zf59/, Empirical Bayes and Approximate Message Passing algorithms for PCA in high dimensions
- 2:45pm - 3:15pm: Abla Kammoun, King Abdullah University of Science and Technology, http://www.laneas.com/abla-kammoun, Precise analysis of large-margin classifiers: A CGMT-based approach
- 3:15pm - 3:30pm: Break
- 3:30pm - 4:30pm: Salem Said, University of Bordeaux, https://www.ims-bordeaux.fr/fr/annuaire/5162-said-salem, Gaussian distributions in Riemannian symmetric spaces
- 4:30pm - 5:00pm: Pedro L. C. Rodrigues, INRIA Palaiseau, Dimensionality transcending: a method for merging SPD datasets with different dimensionalities
- 5:00pm - 5:15pm: Closing
Abstracts of the contributions
Please find below the abstracts of the four talks:
- Zhou Fan, Yale University, http://www.stat.yale.edu/~zf59/, 1h talk: Empirical Bayes and Approximate Message Passing algorithms for PCA in high dimensions: this talk will be divided into two halves. In the first, more applied half, I will describe a new empirical Bayes procedure for principal components analysis in high dimensions, which aims to learn a prior distribution for the PCs from the observed data. Its ideas are based on the Kiefer-Wolfowitz NPMLE, some basic results in asymptotic random matrix theory, and Approximate Message Passing (AMP) algorithms for Bayesian inference. I will explain the interplay between these ideas and demonstrate the method on several genetics examples. In the second, more theoretical half, motivated by this application, I will describe a general extension of AMP algorithms to a class of rotationally invariant matrices, in which the usual bias correction and state evolution are replaced by forms involving the free cumulants of the spectral law. I hope to explain the main ideas behind this algorithm and connect it back to the PCA application. This is joint work with Xinyi Zhong and Chang Su.
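The "basic results in asymptotic random matrix theory" underlying this line of work can be checked numerically. The sketch below (our own illustration in plain NumPy, not the speaker's code; all parameter names are ours) simulates a rank-one spiked covariance model and compares the top sample eigenvalue and eigenvector overlap with the classical BBP asymptotic predictions.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 400, 1600                      # dimension and sample size
gamma = p / n                         # aspect ratio p/n = 0.25
ell = 4.0                             # spike strength, above the BBP threshold sqrt(gamma)

# population covariance: identity plus a rank-one spike ell * u u^T
u = rng.normal(size=p)
u /= np.linalg.norm(u)
X = rng.normal(size=(n, p))
X = X + (np.sqrt(1.0 + ell) - 1.0) * np.outer(X @ u, u)   # X @ Sigma^{1/2}
S = X.T @ X / n                       # sample covariance matrix

evals, evecs = np.linalg.eigh(S)
top_eval = evals[-1]                  # largest sample eigenvalue
overlap2 = (evecs[:, -1] @ u) ** 2    # squared overlap with the true spike

# classical asymptotic predictions, valid when ell > sqrt(gamma):
pred_eval = (1 + ell) * (1 + gamma / ell)                  # ~ 5.31 here
pred_overlap2 = (1 - gamma / ell**2) / (1 + gamma / ell)   # ~ 0.93 here
```

The spiked eigenvalue detaches from the Marchenko-Pastur bulk, whose right edge is (1 + sqrt(gamma))^2 = 2.25 here, precisely the kind of fact the empirical Bayes procedure exploits.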
- Abla Kammoun, King Abdullah University of Science and Technology, http://www.laneas.com/abla-kammoun, 30 min talk: Precise analysis of large-margin classifiers: A CGMT-based approach:
This talk introduces a theoretical framework based on the convex Gaussian min-max theorem (CGMT) for analyzing the performance of optimization-based classifiers in the high-dimensional regime. The CGMT is among the emerging analytical tools for studying machine learning methods that involve implicit solutions to convex optimization problems. We develop a new extension of the CGMT that applies to non-compact optimization sets and can handle optimization problems that may be infeasible. As an illustration, we consider gradient descent on the logistic loss, and show how the CGMT uncovers a W-shaped curve for the test performance, reminiscent of similar behaviors discovered in modern learning architectures.
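The kind of experiment whose asymptotics the CGMT characterizes is easy to set up. The toy simulation below (our own construction, not the speaker's code; the model and all names are assumptions) runs plain gradient descent on the unregularized logistic loss for a noisy Gaussian linear model and measures the test error, the quantity that a CGMT analysis predicts exactly as p and n grow proportionally.

```python
import numpy as np

rng = np.random.default_rng(1)
p, n, sigma = 200, 1000, 0.5          # dimension, samples, label noise level

beta = rng.normal(size=p)
beta /= np.linalg.norm(beta)          # ground-truth unit direction
X = rng.normal(size=(n, p))
y = np.sign(X @ beta + sigma * rng.normal(size=n))   # noisy linear labels in {-1, +1}

def grad(w):
    """Gradient of the average logistic loss (1/n) sum log(1 + exp(-y_i x_i^T w))."""
    z = y * (X @ w)
    return -(X.T @ (y / (1.0 + np.exp(z)))) / n

w = np.zeros(p)
for _ in range(500):                  # plain gradient descent, fixed step size
    w -= 1.0 * grad(w)

# test error and alignment with the truth, on fresh data from the same model
Xt = rng.normal(size=(5000, p))
yt = np.sign(Xt @ beta + sigma * rng.normal(size=5000))
test_err = np.mean(np.sign(Xt @ w) != yt)
cosine = (w @ beta) / np.linalg.norm(w)
```

Sweeping the aspect ratio p/n (or the number of iterations) in such a simulation is how non-monotone test-error curves of the kind mentioned in the abstract are observed empirically.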
- Salem Said, University of Bordeaux, https://www.ims-bordeaux.fr/fr/annuaire/5162-said-salem, 1h talk: Gaussian distributions in Riemannian symmetric spaces: the talk centres on a class of probability distributions which may be defined on any Riemannian symmetric space of non-compact type, and which may be termed "Gaussian distributions". By definition, these are distributions for which maximum-likelihood estimation is equivalent to a Riemannian barycentre problem. The talk will motivate and introduce these distributions, and will then develop their properties in relation to symmetric spaces, statistical inference, random matrix theory, and theta functions. It will also discuss connected problems of Bayesian inference, which lead to original results regarding Riemannian MCMC algorithms.
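The defining property, that maximum-likelihood estimation of the centre of such a Gaussian distribution reduces to a Riemannian barycentre problem, can be made concrete on the most familiar symmetric space of non-compact type: SPD matrices with the affine-invariant metric. The sketch below (our illustration; the function name and the fixed-point scheme are standard but not taken from the talk) computes the barycentre by the usual gradient iteration.

```python
import numpy as np
from scipy.linalg import expm, logm, sqrtm

def karcher_mean(mats, n_iter=50):
    """Riemannian barycentre of SPD matrices under the affine-invariant
    metric, via the standard fixed-point (gradient descent) iteration."""
    M = np.mean(mats, axis=0)                 # Euclidean mean as initial guess
    for _ in range(n_iter):
        Ms = np.real(sqrtm(M))                # M^{1/2}
        Msi = np.linalg.inv(Ms)               # M^{-1/2}
        # average the log-maps of the data points in the tangent space at M
        T = np.mean([np.real(logm(Msi @ A @ Msi)) for A in mats], axis=0)
        M = Ms @ np.real(expm(T)) @ Ms        # exponential-map step back to the manifold
        if np.linalg.norm(T) < 1e-12:         # gradient vanished: converged
            break
    return M
```

For commuting (e.g. diagonal) matrices the barycentre is the elementwise geometric mean, which gives an easy sanity check: the barycentre of diag(1, 4) and diag(4, 1) is diag(2, 2).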
- Pedro L. C. Rodrigues, INRIA Palaiseau, 30 min talk: Dimensionality transcending: a method for merging SPD datasets with different dimensionalities: in this talk, I will present a transfer learning method for datasets with different dimensionalities, coming from different experimental setups but representing the same physical phenomena. I will focus on the case where the data points are symmetric positive definite (SPD) matrices describing the statistical behavior of EEG-based brain-computer interfaces (BCI). I will show results on time series obtained from different experimental setups (e.g., different numbers of electrodes, different electrode placements), which indicate that our proposal can transfer discriminative information between BCI recordings that would, in principle, be incompatible.
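As a rough illustration of the objects involved (not the actual dimensionality-transcending algorithm), the sketch below defines the affine-invariant distance between SPD matrices, the geometry commonly used for EEG covariance matrices, together with a naive identity-padding embedding of a lower-dimensional SPD matrix into a larger space; both function names and the padding scheme are our hypothetical stand-ins for the dimension-matching step such a method requires.

```python
import numpy as np
from scipy.linalg import eigh

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices A and B,
    computed from the generalized eigenvalues of the pencil (B, A)."""
    w = eigh(B, A, eigvals_only=True)
    return np.sqrt(np.sum(np.log(w) ** 2))

def pad_spd(C, n):
    """Embed a k x k SPD matrix into n x n by identity padding -- a naive,
    hypothetical stand-in for a proper dimension-matching step."""
    k = C.shape[0]
    P = np.eye(n)
    P[:k, :k] = C
    return P
```

The affine-invariant distance is unchanged under congruence A -> W A W^T for any invertible W, the invariance (to linear mixing of sensors) that makes this geometry natural for EEG covariance matrices.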