I am very interested in sparsity in neural nets, and I am super-happy that the community around sparsity is growing: the first workshop on the subject has just taken place.
The key organizer is D. C. Mocanu; a couple of years ago I did an experimental PyTorch project building upon his work.
I learned about this via the ML Collective reading group mailing list. The links are in the comments.
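For readers who have not seen Mocanu's papers: the core of his Sparse Evolutionary Training idea, as I understand it, is a periodic prune-and-regrow step on each sparse layer. Below is a rough illustrative sketch of that step in PyTorch; this is my own simplified reading, not code from my repo or his, and the function and variable names are made up for the example.

```python
# Rough illustrative sketch (my simplified reading of SET; not code from my repo):
# periodically drop the smallest-magnitude active weights of a sparse layer,
# then regrow the same number of connections at random positions.
import torch

def prune_and_regrow(weight: torch.Tensor, mask: torch.Tensor, fraction: float = 0.3) -> torch.Tensor:
    """Return an updated binary mask: prune the weakest `fraction` of active
    connections by magnitude, then regrow as many at random inactive positions."""
    active = mask.bool()
    n_prune = int(fraction * active.sum().item())
    if n_prune == 0:
        return mask

    # Magnitudes of active weights; inactive positions get +inf so they are never "pruned".
    magnitudes = weight.abs().masked_fill(~active, float("inf"))
    prune_idx = torch.topk(magnitudes.flatten(), n_prune, largest=False).indices

    new_mask = mask.clone().flatten()
    new_mask[prune_idx] = 0

    # Regrow at random positions that are currently inactive
    # (in this simplification a just-pruned position may be picked again).
    inactive_idx = (new_mask == 0).nonzero(as_tuple=True)[0]
    grow_idx = inactive_idx[torch.randperm(len(inactive_idx))[:n_prune]]
    new_mask[grow_idx] = 1

    return new_mask.view_as(mask)

# Toy usage: a 100x100 layer kept at roughly 10% density.
w = torch.randn(100, 100)
mask = (torch.rand(100, 100) < 0.1).float()
mask = prune_and_regrow(w, mask, fraction=0.3)
sparse_w = w * mask  # the layer's forward pass would use these masked weights
```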
no subject
Date: 2021-07-09 11:29 pm (UTC)
https://www.youtube.com/watch?v=pAOzBbMxAmc
https://www.youtube.com/watch?v=fEw2Hw013pk
no subject
Date: 2021-07-09 11:32 pm (UTC)
Announcement: https://groups.google.com/g/ml-news/c/65k2O2qtzaw?pli=1
The open review page: https://openreview.net/group?id=Sparsity_in_Neural_Networks/2021/Workshop/SNN
The site itself: https://sites.google.com/view/sparsity-workshop-2021/
The site has plenty of informative tabs including https://sites.google.com/view/sparsity-workshop-2021/accepted-papers
no subject
Date: 2021-07-09 11:34 pm (UTC)
https://github.com/anhinga/synapses/blob/master/regularization.md
https://anhinga-drafts.dreamwidth.org/29924.html
(cross-post: https://anhinga-drafts.livejournal.com/30356.html)
no subject
Date: 2021-07-12 04:02 am (UTC)
no subject
Date: 2021-07-12 04:13 am (UTC)"Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks", https://arxiv.org/abs/2102.00554
So, yes, this topic has grown from an exotic niche into something big (although the idea for this standalone workshop still came from the rejection of a workshop proposal at one of the major conferences).
no subject
Date: 2021-07-14 04:23 am (UTC)
Now that this study is done, I can return to focusing on this workshop.