Dataflow matrix machines (by Anhinga anhinga) ([personal profile] dmm) wrote 2021-01-30 05:08 pm (UTC)

No, I was quite wrong about JAX: it should be flexible enough. See my comments at

https://anhinga-anhinga.livejournal.com/84046.html?thread=734286#t734286
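
For concreteness, here is a minimal JAX sketch of the kind of flexibility I have in mind (my own illustration, not code from that thread or from the paper): gradients flow through an unrolled update in which the network matrix is itself part of the evolving state, DMM-style, and ordinary Python control flow is allowed inside the step function.

import jax
import jax.numpy as jnp

def step(state, x):
    # state is a pytree (activations, network matrix); both evolve over time
    h, W = state
    h_new = jnp.tanh(W @ h + x)
    W_new = W + 0.01 * jnp.outer(h_new, h)  # the matrix itself is a stream
    return (h_new, W_new)

def loss(W0, xs):
    state = (jnp.zeros(W0.shape[0]), W0)
    for x in xs:  # plain Python loop; JAX traces through it
        state = step(state, x)
    h_final, _ = state
    return jnp.sum(h_final ** 2)

W0 = 0.1 * jax.random.normal(jax.random.PRNGKey(0), (4, 4))
xs = jnp.ones((3, 4))
grad_W0 = jax.grad(loss)(W0, xs)  # differentiate through the whole unrolled dynamics
print(grad_W0.shape)  # (4, 4)

The point is simply that JAX's tracing does not care whether the thing being updated is "activations" or "weights", which is exactly the kind of blurring these architectures need.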

Now we DO KNOW who the authors are (they are from DeepMind, and DeepMind has essentially switched to JAX):

https://arxiv.org/abs/2101.07627

I know the name of the first author from the AI-generating algorithms community; he wrote https://arxiv.org/abs/2003.03124 "Finding online neural update rules by learning to remember" and many other interesting papers.

The second author co-wrote this one: https://arxiv.org/abs/1606.02580 "Convolution by Evolution: Differentiable Pattern Producing Networks"

Unfortunately, some particularly stupid ICLR reviewers decided to reject this paper. All the more reason to pay close attention to it: if the mainstream of the field is being stupid as usual, then one can benefit from disagreeing and paying early attention to this innovation before it becomes fashionable.
