dmm: (Default)
[personal profile] dmm
It looks like I'll be giving a talk, "Higher-order neuromorphic computations with linear streams", at the workshop "CCC 2020: Continuity, Computability, Constructivity – From Logic to Algorithms" on Sep 3 (they even peer-reviewed the 2-page extended abstract before accepting it):

github.com/anhinga/2020-notes/tree/master/CCC-2020

(It's free to register for that workshop, but it's very mathematical in spirit.)

In connection with that, I've expanded my text describing possible interdisciplinary research directions related to my studies. It used to be 4 pages; now it's 7, although only about 2 pages of new material were added, including a section on the possible interplay with Transformers:

github.com/anhinga/2020-notes/tree/master/research-agenda

---

I posted a brief note on "the AlexNet moment" of the Transformer revolution in mid-July ( www.cs.brandeis.edu/~bukatin/transformer_revolution.html ) and started writing down various meditations on Transformers, on attention-based models in general, and on possible ways for their interplay with DMMs:

github.com/anhinga/2020-notes/tree/master/attention-based-models

This is still quite active and ongoing three weeks later (so far these are just notes written in Markdown format)...
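(For readers unfamiliar with attention-based models: the core of a Transformer layer is scaled dot-product attention. This is not from the notes above, just a minimal illustrative NumPy sketch of that one operation, omitting multi-head projections, masking, and everything else.)

```python
import numpy as np

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query positions, dimension 8
K = rng.normal(size=(6, 8))   # 6 key positions
V = rng.normal(size=(6, 8))   # one value vector per key
out = attention(Q, K, V)
print(out.shape)              # (4, 8): one mixed value vector per query
```

Each output row is a convex combination of the rows of V, with mixing weights determined by query-key similarity; it is this "linear combination of streams selected by learned coefficients" flavor that suggests a connection to linear streams in DMMs.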

Date: 2020-08-11 05:24 am (UTC)
juan_gandhi: (Default)
From: [personal profile] juan_gandhi

Oh, that's dense!
