Date: 2021-01-30 05:08 pm (UTC)
dmm: (0)
From: [personal profile] dmm
No, I was quite wrong about JAX: it should be flexible enough; see my comments at

https://anhinga-anhinga.livejournal.com/84046.html?thread=734286#t734286
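(To illustrate the flexibility in question — this is my own minimal sketch, not code from the linked thread: JAX lets you write ordinary Python functions over arrays and then compose transformations such as `grad`, `jit`, and `vmap` on them.)

```python
import jax
import jax.numpy as jnp

def f(x):
    # an arbitrary smooth function written as plain Python over arrays
    return jnp.sum(jnp.tanh(x) ** 2)

df = jax.grad(f)              # transformation: gradient of f
df_fast = jax.jit(df)         # transformation: JIT-compiled gradient

x = jnp.array([0.0, 1.0, 2.0])
print(df_fast(x))             # gradient evaluated at x; zero at x[0] = 0
```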

Now we DO KNOW who the authors are (they are from DeepMind, and DeepMind has essentially switched to JAX):

https://arxiv.org/abs/2101.07627

I know the name of the first author from the AI-generating algorithms community; he wrote https://arxiv.org/abs/2003.03124 "Finding online neural update rules by learning to remember" and many other interesting papers.

The second author co-wrote this one: https://arxiv.org/abs/1606.02580 "Convolution by Evolution: Differentiable Pattern Producing Networks"

Unfortunately, some remarkably obtuse ICLR reviewers decided to reject this paper. That is all the more reason to pay close attention to it: if the mainstream of the field is as shortsighted as usual, one can benefit from disagreeing and paying early attention to this innovation before it becomes fashionable.