Peter Súkeník
PhD student, Institute of Science and Technology Austria
Verified email at ista.ac.at
Title · Cited by · Year
Deep neural collapse is provably optimal for the deep unconstrained features model
P Súkeník, M Mondelli, CH Lampert
Advances in Neural Information Processing Systems 36, 52991-53024, 2023
Cited by 36 · 2023
Intriguing Properties of Input-dependent Randomized Smoothing
P Súkeník, A Kuvshinov, S Günnemann
Proceedings of the 39th International Conference on Machine Learning, PMLR …, 2021
Cited by 36 · 2021
Average gradient outer product as a mechanism for deep neural collapse
D Beaglehole, P Súkeník, M Mondelli, M Belkin
Advances in Neural Information Processing Systems 37, 130764-130796, 2024
Cited by 21 · 2024
Neural collapse vs. low-rank bias: Is deep neural collapse really optimal?
P Súkeník, C Lampert, M Mondelli
Advances in Neural Information Processing Systems 37, 138250-138288, 2024
Cited by 19 · 2024
Generalization in multi-objective machine learning
P Súkeník, C Lampert
Neural Computing and Applications, 1-15, 2024
Cited by 13 · 2024
The Unreasonable Effectiveness of Fully-Connected Layers for Low-Data Regimes
P Kocsis, P Súkeník, G Brasó, M Nießner, L Leal-Taixé, I Elezi
Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022
Cited by 10 · 2022
Wide neural networks trained with weight decay provably exhibit neural collapse
A Jacot, P Súkeník, Z Wang, M Mondelli
arXiv preprint arXiv:2410.04887, 2024
Cited by 8 · 2024
Neural Collapse is Globally Optimal in Deep Regularized ResNets and Transformers
P Súkeník, CH Lampert, M Mondelli
arXiv preprint arXiv:2505.15239, 2025
Cited by 1 · 2025