# Paper on Regularized Newton accepted at SIAM Journal on Optimization (SIOPT)

My paper on Regularized Newton has been accepted for publication in the SIAM Journal on Optimization (SIOPT). The main result of this work is that one can globalize Newton's method by adding regularization proportional to the square root of the gradient norm. The resulting method achieves global acceleration over gradient descent and converges at the $O(1/k^2)$ rate of cubically regularized Newton.
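As a rough sketch of the idea (not the paper's exact algorithm; the function, constant `H_lip`, and all names below are illustrative assumptions), each step shifts the Hessian by a multiple of the square root of the current gradient norm before solving the Newton system:

```python
import numpy as np

def regularized_newton(grad, hess, x0, H_lip=1.0, iters=50):
    """Newton's method regularized by the square root of the gradient norm.
    H_lip is assumed to bound the Lipschitz constant of the Hessian."""
    x = np.asarray(x0, dtype=float)
    I = np.eye(len(x))
    for _ in range(iters):
        g = grad(x)
        # Regularization shrinks as the gradient vanishes, recovering pure Newton.
        lam = np.sqrt(H_lip * np.linalg.norm(g))
        x = x - np.linalg.solve(hess(x) + lam * I, g)
    return x

# Illustrative test problem: a strongly convex logistic-type loss.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
sigma = lambda t: 1.0 / (1.0 + np.exp(-t))
grad = lambda x: A.T @ sigma(A @ x) + x
hess = lambda x: A.T @ np.diag(sigma(A @ x) * (1 - sigma(A @ x))) @ A + np.eye(5)

x_star = regularized_newton(grad, hess, np.ones(5))
print(np.linalg.norm(grad(x_star)))  # gradient norm driven near machine precision
```

Because the shift vanishes as the gradient goes to zero, the method behaves like plain Newton near the solution while the regularization keeps the early, far-from-solution steps stable.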

##### Konstantin Mishchenko
###### Research Scientist

I study optimization and its applications in machine learning.