End-to-end training of sparse deep neural networks with little-to-no performance loss.
Always sparse. Never dense. But never say never. A Sparse Training repository for the Adaptive Sparse Connectivity concept and its algorithmic instantiation, i.e. Sparse Evolutionary Training, to boost Deep Learning scalability in various aspects (e.g. memory and computational time efficiency, representation and generalization power). A minimal sketch of the prune-and-regrow idea behind this approach appears after the list below.
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
[ICML 2021] "Do We Actually Need Dense Over-Parameterization? In-Time Over-Parameterization in Sparse Training" by Shiwei Liu, Lu Yin, Decebal Constantin Mocanu, Mykola Pechenizkiy
[ICML 2022] "Training Your Sparse Neural Network Better with Any Mask" by Ajay Jaiswal, Haoyu Ma, Tianlong Chen, Ying Ding, and Zhangyang Wang
[Machine Learning Journal (ECML-PKDD 2022 journal track)] Quick and Robust Feature Selection: the Strength of Energy-efficient Sparse Training for Autoencoders
[IJCAI 2022] "Dynamic Sparse Training for Deep Reinforcement Learning" by Ghada Sokar, Elena Mocanu, Decebal Constantin Mocanu, Mykola Pechenizkiy, and Peter Stone.
[ICML 2021] "Selfish Sparse RNN Training" by Shiwei Liu, Decebal Constantin Mocanu, Yulong Pei, Mykola Pechenizkiy
Code for "Training Adversarially Robust Sparse Networks via Bayesian Connectivity Sampling" [ICML 2021]
Implementation for the paper "SpaceNet: Make Free Space For Continual Learning" in PyTorch.
[TMLR] Supervised Feature Selection with Neuron Evolution in Sparse Neural Networks
PyTorch Implementation of TopKAST
[Machine Learning Journal (ECML-PKDD 2022 journal track)] A Brain-inspired Algorithm for Training Highly Sparse Neural Networks
Repository for the SNN-22 Workshop paper "Generalization and Memorization in Sparse Neural Networks".
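Several of the repositories above implement variants of dynamic sparse training in the spirit of Sparse Evolutionary Training (SET): after each training epoch, the smallest-magnitude connections in each sparse layer are pruned and an equal number of new connections are regrown at random, so the network stays sparse throughout training. The following is a minimal, framework-agnostic sketch of that prune-and-regrow step under stated assumptions; the function name, the `zeta` fraction, and the regrowth initialization are illustrative choices, not code taken from any of the listed repositories.

```python
# Minimal sketch of one SET-style prune-and-regrow step on a weight matrix
# with a binary connectivity mask. All names and constants here are
# illustrative assumptions, not the reference implementation.
import numpy as np

def prune_and_regrow(weights, mask, zeta=0.3, rng=None):
    """Prune the fraction `zeta` of active weights with the smallest
    magnitude, then regrow the same number of connections at random
    inactive positions, keeping the overall sparsity level constant."""
    rng = rng if rng is not None else np.random.default_rng()

    active = np.flatnonzero(mask)           # flat indices of existing connections
    n_prune = int(zeta * active.size)
    if n_prune == 0:
        return weights, mask

    # Remove the smallest-magnitude active connections.
    order = np.argsort(np.abs(weights.flat[active]))
    pruned = active[order[:n_prune]]
    mask.flat[pruned] = 0
    weights.flat[pruned] = 0.0

    # Regrow the same number of connections at random inactive positions,
    # initialized with small random values (an assumed initialization).
    inactive = np.flatnonzero(mask.flat[:] == 0)
    regrown = rng.choice(inactive, size=n_prune, replace=False)
    mask.flat[regrown] = 1
    weights.flat[regrown] = rng.normal(0.0, 0.01, size=n_prune)
    return weights, mask

# Example usage: a 100x100 layer kept at ~10% density across updates.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mask = (rng.random((100, 100)) < 0.1).astype(np.int8)
    weights = rng.normal(0.0, 0.01, size=(100, 100)) * mask
    weights, mask = prune_and_regrow(weights, mask, zeta=0.3, rng=rng)
    print("density:", mask.mean())          # stays ~0.1 after the update
```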