My recent writings
Aug. 10th, 2020 10:00 pm
It looks like I'll be giving a talk, "Higher-order neuromorphic computations with linear streams", at the "CCC 2020: Continuity, Computability, Constructivity – From Logic to Algorithms" workshop on Sep 3 (they've even peer-reviewed the 2-page extended abstract before accepting it):
github.com/anhinga/2020-notes/tree/master/CCC-2020
(It's free to register for that workshop, but it's very mathematical in spirit.)
In connection with that, I've expanded my text describing possible interdisciplinary research directions related to my studies. It used to be 4 pages and is now 7 pages, but in reality only 2 new pages were added, including the possible interplay with Transformers:
github.com/anhinga/2020-notes/tree/master/research-agenda
---
I posted a brief note on "the AlexNet moment" of the Transformer revolution in mid-July ( www.cs.brandeis.edu/~bukatin/transformer_revolution.html ) and started writing down various meditations on Transformers, on attention-based models in general, and on possible ways for their interplay with DMMs:
github.com/anhinga/2020-notes/tree/master/attention-based-models
This work is quite active and still ongoing 3 weeks later (so far these are just notes written in Markdown format)...