
Slavomír Hanzely
Postdoctoral Researcher, MBZUAI
Verified email at mbzuai.ac.ae
Title · Cited by · Year
Lower bounds and optimal algorithms for personalized federated learning
F Hanzely, S Hanzely, S Horváth, P Richtárik
Advances in Neural Information Processing Systems 33, 2304-2315, 2020
Cited by 245 · 2020
ZeroSARAH: Efficient nonconvex finite-sum optimization with zero full gradient computation
Z Li, S Hanzely, P Richtárik
arXiv preprint arXiv:2103.01447, 2021
Cited by 43 · 2021
A Damped Newton Method Achieves Global and Local Quadratic Convergence Rate
S Hanzely, D Kamzolov, D Pasechnyuk, A Gasnikov, P Richtarik, M Takac
Advances in Neural Information Processing Systems 35, 25320-25334, 2022
Cited by 36 · 2022
Distributed Newton-type methods with communication compression and Bernoulli aggregation
R Islamov, X Qian, S Hanzely, M Safaryan, P Richtárik
arXiv preprint arXiv:2206.03588, 2022
Cited by 22 · 2022
Adaptive learning of the optimal mini-batch size of SGD
M Alfarra, S Hanzely, A Albasyoni, B Ghanem, P Richtárik
Workshop on Optimization for Machine Learning, NeurIPS 2020, 2020
Cited by 13* · 2020
Sketch-and-Project Meets Newton Method: Global Convergence with Low-Rank Updates
S Hanzely
arXiv preprint arXiv:2305.13082, 2023
Cited by 10 · 2023
Convergence of First-Order Algorithms for Meta-Learning with Moreau Envelopes
K Mishchenko, S Hanzely, P Richtárik
arXiv preprint arXiv:2301.06806, 2023
Cited by 8 · 2023
Adaptive Optimization Algorithms for Machine Learning
S Hanzely
arXiv preprint arXiv:2311.10203, 2023
Cited by 6 · 2023
DAG: Projected Stochastic Approximation Iteration for DAG Structure Learning
K Ziu, S Hanzely, L Li, K Zhang, M Takáč, D Kamzolov
arXiv preprint arXiv:2410.23862, 2024
Cited by 4 · 2024
Simple Stepsize for Quasi-Newton Methods with Global Convergence Guarantees
A Agafonov, V Ryspayev, S Horváth, A Gasnikov, M Takáč, S Hanzely
arXiv preprint arXiv:2508.19712, 2025
Cited by 1 · 2025
Preconditioned Norms: A Unified Framework for Steepest Descent, Quasi-Newton and Adaptive Methods
A Veprikov, A Bolatov, S Horváth, A Beznosikov, M Takáč, S Hanzely
arXiv preprint arXiv:2510.10777, 2025
2025
Loss-Transformation Invariance in the Damped Newton Method
A Shestakov, S Bohara, S Horváth, M Takáč, S Hanzely
arXiv preprint arXiv:2509.25782, 2025
2025
Polyak Stepsize: Estimating Optimal Functional Values Without Parameters or Prior Knowledge
F Abdukhakimov, CA Pham, S Horváth, M Takáč, S Hanzely
arXiv preprint arXiv:2508.17288, 2025
2025
Newton Method Revisited: Global Convergence Rates up to O(1/k^3) for Stepsize Schedules and Linesearch Procedures
S Hanzely, F Abdukhakimov, M Takáč
arXiv preprint arXiv:2405.18926, 2024
2024