I am very interested in sparsity in neural nets, and I am super-happy that the community around sparsity is growing: the first workshop on the subject has just taken place.
The key organizer is D. C. Mocanu; I did an experimental PyTorch project building upon his work a couple of years ago.
I learned about this via the ML Collective reading group mailing list. The links are in the comments.
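For context, Mocanu's best-known contribution here is Sparse Evolutionary Training (SET): the network stays sparse throughout training, and every so often the smallest-magnitude active connections are dropped and the same number of new connections are regrown at random. Below is a minimal PyTorch sketch of that prune-and-regrow step, assuming a dense weight tensor paired with a 0/1 float mask; the function name and the zeta fraction are my own labels for illustration, not code from my old project.

import torch

def set_prune_and_regrow(weight, mask, zeta=0.3):
    # One SET-style topology update (hypothetical helper, for illustration):
    # drop the fraction `zeta` of active weights with the smallest magnitude,
    # then regrow the same number of connections at random inactive positions.
    active = mask.bool()
    n_drop = int(zeta * int(active.sum()))
    if n_drop == 0:
        return mask

    # Prune: find the n_drop smallest-magnitude active weights.
    magnitudes = weight.abs().masked_fill(~active, float("inf"))
    drop_idx = torch.topk(magnitudes.flatten(), n_drop, largest=False).indices
    new_mask = mask.clone().flatten()
    new_mask[drop_idx] = 0.0

    # Regrow: activate n_drop random currently-inactive positions.
    inactive_idx = (new_mask == 0).nonzero(as_tuple=True)[0]
    grow_idx = inactive_idx[torch.randperm(len(inactive_idx))[:n_drop]]
    new_mask[grow_idx] = 1.0
    new_mask = new_mask.view_as(mask)

    # Newly grown weights stay at zero here; SET proper re-initializes
    # them to small random values.
    weight.data.mul_(new_mask)
    return new_mask

In SET proper this update runs once per training epoch, and the mask is re-applied to the weights after every optimizer step so that pruned connections stay at zero.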
no subject
Date: 2021-07-12 04:13 am (UTC)
"Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks", https://arxiv.org/abs/2102.00554
So, yes, this topic has grown from an exotic niche into something big (although the idea of this standalone workshop still came from the rejection of a workshop proposal by one of the major conferences).
no subject
Date: 2021-07-14 04:23 am (UTC)
Now that this study is done, I can return to focusing on this workshop.