Shiwei Liu
Postdoc, IFML, VITA group, The University of Texas at Austin
Verified email at tue.nl - Homepage
Title · Cited by · Year
Sparse evolutionary deep learning with over one million artificial neurons on commodity hardware
S Liu, DC Mocanu, ARR Matavalam, Y Pei, M Pechenizkiy
Neural Computing and Applications 33, 2589-2604, 2021
63 · 2021
Do we actually need dense over-parameterization? in-time over-parameterization in sparse training
S Liu, L Yin, DC Mocanu, M Pechenizkiy
ICML2021, International Conference on Machine Learning, 2021
54 · 2021
Sparse Training via Boosting Pruning Plasticity with Neuroregeneration
S Liu, T Chen, X Chen, Z Atashgahi, L Yin, H Kou, L Shen, M Pechenizkiy, ...
NeurIPS2021, Advances in Neural Information Processing Systems, 2021
53 · 2021
Efficient and effective training of sparse recurrent neural networks
S Liu, I Ni’mah, V Menkovski, DC Mocanu, M Pechenizkiy
Neural Computing and Applications, 1-12, 2021
28* · 2021
More convnets in the 2020s: Scaling up kernels beyond 51x51 using sparsity
S Liu, T Chen, X Chen, X Chen, Q Xiao, B Wu, M Pechenizkiy, D Mocanu, ...
ICLR2023, The International Conference on Learning Representations, https …, 2023
27 · 2023
Selfish sparse RNN training
S Liu, DC Mocanu, Y Pei, M Pechenizkiy
ICML2021, International Conference on Machine Learning, 2021
27 · 2021
The Unreasonable Effectiveness of Random Pruning: Return of the Most Naive Baseline for Sparse Training
S Liu, T Chen, X Chen, L Shen, DC Mocanu, Z Wang, M Pechenizkiy
ICLR2022, The International Conference on Learning Representations, 2022
26 · 2022
Topological Insights into Sparse Neural Networks
S Liu, T Van der Lee, A Yaman, Z Atashgahi, D Ferraro, G Sokar, ...
ECML2020, European Conference on Machine Learning, 2020
22 · 2020
Deep Ensembling with No Overhead for either Training or Testing: The All-Round Blessings of Dynamic Sparsity
S Liu, T Chen, Z Atashgahi, X Chen, G Sokar, E Mocanu, M Pechenizkiy, ...
ICLR2022, The International Conference on Learning Representations, 2021
19 · 2021
Achieving personalized federated learning with sparse local models
T Huang, S Liu, L Shen, F He, W Lin, D Tao
arXiv preprint arXiv:2201.11380, 2022
12 · 2022
A Brain-inspired Algorithm for Training Highly Sparse Neural Networks
Z Atashgahi, J Pieterse, S Liu, DC Mocanu, R Veldhuis, M Pechenizkiy
Machine Learning Journal (ECML-PKDD 2022 journal track), 2019
12* · 2019
Learning Sparse Neural Networks for Better Generalization
S Liu
IJCAI2020 Doctoral Consortium, International Joint Conference on Artificial …, 2020
7 · 2020
On improving deep learning generalization with adaptive sparse connectivity
S Liu, DC Mocanu, M Pechenizkiy
ICML 2019 workshop of Understanding and Improving Generalization in Deep …, 2019
6 · 2019
Hierarchical semantic segmentation using psychometric learning
L Yin, V Menkovski, S Liu, M Pechenizkiy
arXiv preprint arXiv:2107.03212, 2021
3 · 2021
Sparsity May Cry: Let Us Fail (Current) Sparse Neural Networks Together!
S Liu, T Chen, Z Zhang, X Chen, T Huang, A Jaiswal, Z Wang
ICLR2023, The International Conference on Learning Representations, 2023
2 · 2023
Ten Lessons We Have Learned in the New "Sparseland": A Short Handbook for Sparse Neural Network Researchers
S Liu, Z Wang
arXiv preprint arXiv:2302.02596, 2023
2 · 2023
Dynamic Sparse Network for Time Series Classification: Learning What to “See”
Q Xiao, B Wu, Y Zhang, S Liu, M Pechenizkiy, E Mocanu, DC Mocanu
NeurIPS2022, 36th Annual Conference on Neural Information Processing Systems, 2022
2 · 2022
Don't Be So Dense: Sparse-to-Sparse GAN Training Without Sacrificing Performance
S Liu, Y Tian, T Chen, L Shen
International Journal of Computer Vision (IJCV), 2022
2 · 2022
FreeTickets: Accurate, Robust and Efficient Deep Ensemble by Training with Dynamic Sparsity
S Liu, T Chen, Z Atashgahi, X Chen, G Sokar, E Mocanu, M Pechenizkiy, ...
arXiv preprint arXiv:2106.14568, 2021
2 · 2021
Supervised feature selection with neuron evolution in sparse neural networks
Z Atashgahi, X Zhang, N Kichler, S Liu, L Yin, M Pechenizkiy, R Veldhuis, ...
arXiv preprint arXiv:2303.07200, 2023
1 · 2023