Entry tags:
Sparsity in Neural Networks (first workshop with this title)
I am very interested in sparsity in neural nets, and I am super-happy that the community around sparsity is growing: the first workshop on the subject has just taken place.
The key organizer is D. C. Mocanu; I did an experimental PyTorch project building upon his work a couple of years ago.
I learned about this via the ML Collective reading group mailing list. The links are in the comments.
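For readers who have not seen it, the core of Mocanu's sparse evolutionary training idea is, roughly, a periodic prune-and-regrow step applied to each layer's connectivity mask. Below is an illustrative sketch of that step (not the actual project code; the layer size, initial density, and drop fraction are arbitrary choices):

# Illustrative sketch only: one prune-and-regrow step on a single layer's mask.
# Drop the smallest-magnitude active connections, regrow the same number at
# random empty positions. Hyperparameters here are made up.
import torch

def set_evolution_step(weight: torch.Tensor, mask: torch.Tensor, drop_fraction: float = 0.3):
    """`weight` and `mask` share a shape; returns an updated 0/1 mask."""
    active = mask.bool()
    n_drop = int(drop_fraction * active.sum().item())
    if n_drop == 0:
        return mask

    # 1) Drop the n_drop active connections with the smallest |weight|.
    magnitudes = weight.abs().masked_fill(~active, float("inf"))
    drop_idx = torch.topk(magnitudes.flatten(), n_drop, largest=False).indices
    new_mask = mask.clone().flatten()
    new_mask[drop_idx] = 0.0

    # 2) Regrow the same number of connections at random currently-empty positions.
    empty_idx = (new_mask == 0).nonzero(as_tuple=True)[0]
    grow_idx = empty_idx[torch.randperm(empty_idx.numel())[:n_drop]]
    new_mask[grow_idx] = 1.0
    return new_mask.view_as(mask)

# Usage: keep the effective weights as weight * mask during training,
# and run the evolution step every epoch or so.
w = torch.randn(128, 64)
m = (torch.rand_like(w) < 0.1).float()   # start at roughly 10% density
m = set_evolution_step(w, m, drop_fraction=0.3)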
no subject
https://www.youtube.com/watch?v=pAOzBbMxAmc
https://www.youtube.com/watch?v=fEw2Hw013pk
no subject
Announcement: https://groups.google.com/g/ml-news/c/65k2O2qtzaw?pli=1
The open review page: https://openreview.net/group?id=Sparsity_in_Neural_Networks/2021/Workshop/SNN
The site itself: https://sites.google.com/view/sparsity-workshop-2021/
The site has plenty of informative tabs, including https://sites.google.com/view/sparsity-workshop-2021/accepted-papers
no subject
https://github.com/anhinga/synapses/blob/master/regularization.md
https://anhinga-drafts.dreamwidth.org/29924.html
(cross-post: https://anhinga-drafts.livejournal.com/30356.html)
no subject
"Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks", https://arxiv.org/abs/2102.00554
So, yes, this topic has grown from an exotic niche into something big (although the idea for this standalone workshop did originate from a workshop proposal that was rejected by one of the major conferences).
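For anyone who wants a hands-on feel for the "pruning" half of that survey's title, PyTorch ships basic magnitude-pruning utilities in torch.nn.utils.prune. A minimal sketch (the toy two-layer model and the 50% sparsity level are arbitrary choices):

# Minimal magnitude-pruning sketch using PyTorch's built-in utilities.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Globally remove the 50% of weights with the smallest magnitude across both layers.
prune.global_unstructured(
    [(model[0], "weight"), (model[2], "weight")],
    pruning_method=prune.L1Unstructured,
    amount=0.5,
)

# The masks are stored as buffers; report the resulting per-layer sparsity
# (it will differ from 50% per layer, since the threshold is global).
for i in (0, 2):
    w = model[i].weight
    print(f"layer {i}: {(w == 0).float().mean().item():.2%} zeros")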
no subject
Now that this study is done, I can return to focusing on this workshop.