
Announcement

27 janvier 2022

PhD proposal: Asynchronous MCMC algorithms for fast Bayesian inference


Category: PhD student


Job type: PhD proposal

Keywords: Bayesian inference, MCMC algorithms, asynchronous algorithms.

Dates: Starting date around September 2022.

Full job description: https://pthouvenin.github.io/assets/pdfs/phd_project_2022_CRIStAL_detailed.pdf

Laboratory: Centre de Recherche en Informatique, Signal et Automatique de Lille (UMR 9189 CRIStAL), Villeneuve d'Ascq, France.

Contacts: Pierre Chainais (pierre(dot)chainais(at)centralelille(dot)fr),
Pierre-Antoine Thouvenin (pierre(dash)antoine(dot)thouvenin(at)centralelille(dot)fr).

 

Project overview

This project aims to accelerate MCMC algorithms for fast Bayesian inference in large-scale problems. Applications in astronomy (e.g., hyperspectral imaging) or in remote sensing (e.g., multimodal multi-temporal source separation) could be considered. The project is part of the ANR Chaire IA SHERLOCK led by Pierre Chainais (co-funded by ANR, ISITE, Centrale Lille Institut and Région Hauts-de-France).

Many signal and image processing applications, ranging from astronomy (Abdulaziz2019, Cai2018) to remote sensing (Ghamisi2019, Borsoi2021), involve large datasets. In the absence of ground truth, fast parameter inference under controlled uncertainty is critical to guarantee the quality of the resulting predictions.

Asynchronous (parallel or distributed) optimization algorithms have recently regained interest due to their potential for acceleration compared with their synchronous counterparts (Hannah2017). However, optimization algorithms only provide a point estimate, such as the maximum a posteriori (MAP) estimator. Markov chain Monte Carlo (MCMC) methods provide richer information by sampling the posterior distribution of the model, at the price of a larger computational cost. Nevertheless, recent works at the interface between deterministic and stochastic optimization have introduced efficient samplers that can address larger datasets (Durmus2018a, Vono2020). With the exception of (Terenin2020, Simsekli2018), asynchronous MCMC algorithms largely remain to be investigated.
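As a point of reference for the kind of sampler involved (purely illustrative, not part of the project description), a minimal random-walk Metropolis sampler in Python shows how MCMC yields a full set of posterior samples rather than a single point estimate; the target and step size below are arbitrary choices for the sketch:

```python
import numpy as np

def random_walk_metropolis(log_post, x0, n_samples, step=0.5, seed=None):
    """Minimal random-walk Metropolis sampler for a 1-D target.

    log_post : unnormalized log-posterior density
    x0       : initial state
    Returns an array of n_samples correlated posterior samples.
    """
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    samples = np.empty(n_samples)
    for t in range(n_samples):
        # Gaussian proposal centered at the current state
        prop = x + step * rng.standard_normal()
        lp_prop = log_post(prop)
        # Accept with probability min(1, pi(prop) / pi(x))
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples[t] = x
    return samples

# Toy target: standard Gaussian, log pi(x) = -x^2/2 up to a constant
chain = random_walk_metropolis(lambda x: -0.5 * x**2,
                               x0=3.0, n_samples=20000, seed=0)
burned = chain[5000:]  # discard burn-in
print(burned.mean(), burned.std())  # both close to the true (0, 1)
```

Unlike a MAP estimate, the chain above supports uncertainty quantification (credible intervals, posterior variances); the asynchronous variants targeted by this project aim to produce such chains on distributed architectures without global synchronization barriers.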

This PhD project aims to study the potential of asynchronous MCMC algorithms for fast Bayesian inference in high-dimensional problems.


Full project description and application procedure

See https://pthouvenin.github.io/assets/pdfs/phd_project_2022_CRIStAL_detailed.pdf


Profile and requirements

Master 2 or final-year engineering school students with a major in applied mathematics, computer science or electrical engineering. The project requires a strong background in data science and/or machine learning (statistics, optimization) and in signal & image processing. Very good Python coding skills are expected. A B2 English level is mandatory.

Knowledge of C++ programming, as well as experience with or interest in parallel/distributed code development (MPI, OpenMP, CUDA, ...), will be appreciated.


References

Abdulaziz, Abdullah et al. (2019). “Wideband Super-Resolution Imaging in Radio Interferometry via Low Rankness and Joint Average Sparsity Models (HyperSARA)”. In: Monthly Notices of the Royal Astronomical Society 489.1, pp. 1230–1248.
Cai, Xiaohao et al. (2018). “Uncertainty Quantification for Radio Interferometric Imaging – I. Proximal MCMC Methods”. In: Monthly Notices of the Royal Astronomical Society 480.3, pp. 4154–4169.
Durmus, Alain et al. (2018). “Efficient Bayesian Computation by Proximal Markov Chain Monte Carlo: When Langevin Meets Moreau”. In: SIAM J. Imaging Sci. 11.1, pp. 473–506.
Ghamisi, Pedram et al. (2019). “Multisource and Multitemporal Data Fusion in Remote Sensing: A Comprehensive Review of the State of the Art”. In: IEEE Geoscience and Remote Sensing Magazine 7.1, pp. 6–39.
Hannah, Robert et al. (2017). “More Iterations per Second, Same Quality – Why Asynchronous Algorithms May Drastically Outperform Traditional Ones”. In: arXiv: 1708.05136.
Simsekli, Umut et al. (2018). “Asynchronous Stochastic Quasi-Newton MCMC for Non-Convex Optimization”. In: International Conference on Machine Learning, pp. 4674–4683.
Terenin, Alexander et al. (2020). “Asynchronous Gibbs Sampling”. In: arXiv: 1509.08999.

 

