Posts

ICML Outstanding Paper Award
I’m delighted to share that Aaron Defazio and I received the ICML Outstanding Paper Award for our work on D-Adaptation. The associated GitHub repository for our paper has been quite popular, and we are working hard on extensions that will make adaptive methods even more useful for deep learning. Our first extension, Prodigy, is also available on GitHub and has been performing even better than D-Adaptation in our experiments. Expect more updates from us soon!
Online talk at Technology Innovation Institute
Today I’m giving an online talk in the AIDRC Seminar Series of the Technology Innovation Institute. The talk announcement, along with the abstract, can be found on the seminar’s website. In short, the topic of my presentation is our 2022 ICML paper ProxSkip and several of its extensions developed by other authors.
Paper on Regularized Newton accepted at SIAM Journal on Optimization (SIOPT)
My paper on Regularized Newton has been accepted for publication in the SIAM Journal on Optimization (SIOPT). The main result of this work is to show that one can globalize Newton’s method by adding regularization proportional to the square root of the gradient norm. The resulting method achieves global acceleration over gradient descent and converges with the $O(1/k^2)$ rate of cubically regularized Newton.
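For concreteness, a gradient-regularized Newton step of this kind can be sketched as follows, where $H$ is a Hessian Lipschitz-type constant; the exact notation and constants here are my shorthand rather than a quote from the paper:

$$ x_{k+1} = x_k - \left( \nabla^2 f(x_k) + \sqrt{H\,\|\nabla f(x_k)\|}\, I \right)^{-1} \nabla f(x_k). $$

The regularization term $\sqrt{H\,\|\nabla f(x_k)\|}$ shrinks as the gradient vanishes, so the iteration behaves like damped gradient descent far from the solution and like pure Newton close to it.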
Presenting at 2022 Workshop on FL and Analytics organized by Google
I’m taking part in the 2022 Workshop on Federated Learning and Analytics on 9 and 10 November. I am giving a talk about our work on Asynchronous SGD in the mini-workshop on Federated Systems at Scale on 9 November.
I'm giving a talk at Institut Henri Poincaré
I’m giving a talk at Séminaire Parisien d’Optimisation on 10 October. I will be presenting my work on second-order optimization, including the Super-Universal Newton paper.