Dmitry Grishchenko - Proximal optimization with automatic dimension reduction for large-scale learning

Tuesday 3 Nov 2020, 15:00
Organized by: Dmitry Grishchenko
Speaker: Dmitry Grishchenko
Due to Covid-19, the defense will be held fully online via Zoom:

https://univ-grenoble-alpes-fr.zoom.us/j/94395402286?pwd=UVJtZExEdEZhUzB...

Jury committee:

  • Jérôme Malick, CNRS, LJK, Thesis director
  • Pascal Bianchi, Professor, Télécom ParisTech, Referee
  • Peter Richtárik, Professor, KAUST, Referee
  • Julien Mairal, Research Scientist, Inria Grenoble, Examiner
  • Massih-Reza Amini, Professor, Université Grenoble Alpes, Thesis co-director
  • Aleksander Gasnikov, Associate Professor, MIPT, Examiner
  • Samuel Vaiter, CNRS, Délégation Centre-Est, Examiner
  • Frank Iutzeler, Maître de Conférences, Université Grenoble Alpes, Thesis co-director


In this thesis, we develop a framework to reduce the dimensionality of composite optimization problems with sparsity-inducing regularizers. Based on the identification property of proximal methods, we first develop a “sketch-and-project” method that uses projections based on the structure of the current point. This method allows us to work with random low-dimensional subspaces instead of the full space when the final solution is sparse. Second, we place ourselves in the context of delay-tolerant asynchronous proximal methods and use our dimension-reduction technique to decrease the total size of communications. However, this technique is proven to converge only for well-conditioned problems, both in theory and in practice. We therefore investigate wrapping it in a proximal reconditioning framework. This leads to a theoretically backed algorithm that is guaranteed to cost less in terms of communications than its non-sparsified counterpart; we show in practice that it also yields faster runtime convergence when the sparsity of the problem is sufficiently large.
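As a rough illustration of the identification-based dimension reduction described above, below is a minimal NumPy sketch for a lasso problem, f(x) = 0.5*||Ax - b||^2 + lam*||x||_1. The sampling rule (the p_keep parameter), the function names, and the synthetic data are illustrative assumptions only; this is a toy sketch under those assumptions, not the exact algorithm from the thesis.

import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sketched_prox_grad(A, b, lam, n_iters=500, p_keep=0.2, seed=0):
    # Illustrative sparse proximal-gradient loop: each iteration updates
    # only a random subset of coordinates that always contains the
    # current support (nonzeros) of the iterate, mimicking an
    # identification-based dimension reduction.
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for f = 0.5*||Ax-b||^2
    for _ in range(n_iters):
        support = x != 0
        # Random low-dimensional subspace: the support plus a few
        # randomly sampled extra coordinates (so zeros can re-enter).
        mask = support | (rng.random(n) < p_keep)
        grad = A.T @ (A @ x - b)  # full gradient, kept simple here
        x_new = x.copy()
        x_new[mask] = soft_threshold(x[mask] - step * grad[mask], step * lam)
        x = x_new
    return x

# Tiny usage example on synthetic sparse data.
rng = np.random.default_rng(1)
A = rng.standard_normal((60, 200))
x_true = np.zeros(200)
x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(60)
x_hat = sketched_prox_grad(A, b, lam=0.1)
print("nonzeros identified:", np.flatnonzero(x_hat))

Once the iterate's support has stabilized (the identification property), the mask stays low-dimensional, so in a distributed setting only the masked coordinates would need to be communicated each round; that communication saving is the point of the sparsification discussed in the abstract.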