dmm: (Default)
github.com/google/learned_optimization - "Meta-learning optimizers and more with JAX"

This library is used by various interesting papers, including the famous "persistent evolution strategies" paper (which I don't understand) and the tempting "Gradients are Not All You Need" paper, arxiv.org/abs/2111.05803.

Moreover, it is used by the super-interesting, must-read paper "Practical tradeoffs between memory, compute, and performance in learned optimizers", arxiv.org/abs/2203.11860, which is being published at the Conference on Lifelong Learning Agents (CoLLAs 2022, Aug 18-24), lifelong-ml.cc.
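For readers new to the topic, the core idea behind a learned optimizer can be sketched in a few lines of plain JAX. This is my own illustrative sketch, not the learned_optimization API: the update rule itself has parameters (here just a single learned log step size), which in a real learned optimizer would be the weights of a small neural network and would themselves be meta-trained.

```python
import jax
import jax.numpy as jnp

def loss(params):
    # A toy quadratic objective with its minimum at params = 3.0.
    return (params - 3.0) ** 2

def learned_update(theta, params, grad):
    # The "optimizer" is itself parameterized: theta is a learned
    # log step size. In a real learned optimizer, theta would be
    # the weights of a small network, meta-trained across tasks.
    return params - jnp.exp(theta) * grad

@jax.jit
def inner_step(theta, params):
    grad = jax.grad(loss)(params)
    return learned_update(theta, params, grad)

theta = jnp.log(0.1)      # the "optimizer parameters"
params = jnp.array(0.0)   # the inner problem's parameters
for _ in range(100):
    params = inner_step(theta, params)
# params now approaches the minimum at 3.0
```

Meta-training would then differentiate (or estimate gradients of, as in the evolution-strategies papers above) the final inner loss with respect to theta.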
kidger.site/thoughts/jax-vs-julia/

The various correspondences between JAX and Julia constructs that he lists there are quite useful for people practicing either JAX or Julia.

(I am having a good time with both JAX and Julia this year.)
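One example of the kind of correspondence discussed there (my own illustration, not taken from his post): Julia's dot-broadcasting `f.(xs)` corresponds roughly to JAX's `jax.vmap`, which lifts a function written for a single input into a function over a batch.

```python
import jax
import jax.numpy as jnp

def f(x):
    # A scalar function, written for a single input.
    return jnp.sin(x) ** 2 + 1.0

xs = jnp.linspace(0.0, 1.0, 5)

# jax.vmap maps the scalar function over the leading axis,
# much like Julia's broadcast syntax f.(xs).
ys = jax.vmap(f)(xs)

# For simple elementwise functions like this one, the same result
# is also obtainable directly, since jnp.sin is already vectorized.
assert jnp.allclose(ys, jnp.sin(xs) ** 2 + 1.0)
```

The payoff of `vmap` over manual vectorization shows up for functions that are awkward to write in array form, since it composes with `jax.grad` and `jax.jit`.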

Dataflow matrix machines (by Anhinga anhinga)