Zahra Atashgahi
Cited by
Topological insights into sparse neural networks
S Liu, T Van der Lee, A Yaman, Z Atashgahi, D Ferraro, G Sokar, ...
ECML PKDD 2020, European Conference on Machine Learning and Principles and …, 2020
Quick and robust feature selection: the strength of energy-efficient sparse training for autoencoders
Z Atashgahi, G Sokar, T van der Lee, E Mocanu, DC Mocanu, R Veldhuis, ...
Machine Learning Journal, 2021
Sparse Training via Boosting Pruning Plasticity with Neuroregeneration
S Liu, T Chen, X Chen, Z Atashgahi, L Yin, H Kou, L Shen, M Pechenizkiy, ...
Advances in Neural Information Processing Systems (NeurIPS 2021), 2021
Deep Ensembling with No Overhead for either Training or Testing: The All-Round Blessings of Dynamic Sparsity
S Liu, T Chen, Z Atashgahi, X Chen, G Sokar, E Mocanu, M Pechenizkiy, ...
arXiv preprint arXiv:2106.14568, 2021
Unsupervised Online Memory-free Change-point Detection using an Ensemble of LSTM-Autoencoder-based Neural Networks
Z Atashgahi, DC Mocanu, R Veldhuis, M Pechenizkiy
8th ACM Celebration of Women in Computing womENcourage, 2021
Abnormal Activity Detection for the Elderly People using ConvLSTM Autoencoder
E Nazerfard, Z Atashgahi, A Nadali
Quick and Robust Feature Selection: the Strength of Energy-efficient Sparse Training for Autoencoders (poster)
Z Atashgahi, GAZN Sokar, T van der Lee, E Mocanu, DC Mocanu, ...
Sparsity in Neural Networks: Advancing Understanding and Practice 2021, 2021
A Brain-inspired Algorithm for Training Highly Sparse Neural Networks
Z Atashgahi, J Pieterse, S Liu, D Constantin Mocanu, R Veldhuis, ...
arXiv preprint arXiv:1903.07138, 2021