dmm: (Default)
github.com/google/learned_optimization - "Meta-learning optimizers and more with JAX"

This is used by various interesting papers, including the famous "persistent evolution strategies" paper (which I don't fully understand) and the tempting "Gradients are Not All You Need" paper, arxiv.org/abs/2111.05803.

Moreover, it is used by the super-interesting, must-read paper "Practical tradeoffs between memory, compute, and performance in learned optimizers", arxiv.org/abs/2203.11860, which is being published at the Conference on Lifelong Learning Agents (CoLLAs 2022, Aug 18-24, lifelong-ml.cc).
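To give a sense of what "meta-learning an optimizer" means in JAX, here is a tiny sketch of the basic pattern: differentiate through an unrolled inner optimization to train an optimizer parameter. This is my own minimal toy (a single learnable learning rate), not the learned_optimization library's actual API.

```python
import jax
import jax.numpy as jnp

# Toy "learned optimizer": one learnable meta-parameter, the log learning rate.
# Meta-training takes gradients through an unrolled inner SGD loop --
# a minimal sketch of the pattern, not the learned_optimization API.

def inner_loss(w):
    # A simple quadratic the inner optimizer has to minimize.
    return jnp.sum((w - 3.0) ** 2)

def unrolled_loss(log_lr, w0, steps=10):
    lr = jnp.exp(log_lr)  # parametrize in log space to keep lr positive
    w = w0
    for _ in range(steps):
        w = w - lr * jax.grad(inner_loss)(w)  # unrolled inner SGD steps
    return inner_loss(w)  # meta-objective: inner loss after the unroll

w0 = jnp.zeros(2)
log_lr = jnp.array(-3.0)  # start with a deliberately too-small learning rate

# Meta-train the learning rate by gradient descent through the unroll.
for _ in range(50):
    log_lr = log_lr - 0.1 * jax.grad(unrolled_loss)(log_lr, w0)
```

After meta-training, the final inner loss at the learned learning rate is much lower than at the initial one. (Differentiating through long unrolls like this is exactly where the variance and bias issues discussed in "Gradients are Not All You Need" and the persistent evolution strategies work come into play.)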
dmm: (Default)
kidger.site/thoughts/jax-vs-julia/

The correspondences between JAX and Julia constructs that he lists there are quite useful for people practicing either JAX or Julia.

(I am having a good time with both JAX and Julia this year.)
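As one illustration of the kind of correspondence in question (my own example, not necessarily one from Kidger's list): jax.vmap plays roughly the role that Julia's broadcasting dot-call syntax `f.(xs)` plays.

```python
import jax
import jax.numpy as jnp

def f(x):
    # A scalar function we want to apply elementwise.
    return jnp.sin(x) * x

xs = jnp.arange(4.0)
ys = jax.vmap(f)(xs)  # vectorize f over the leading axis
# The rough Julia counterpart would be: ys = f.(xs)
```

Both mechanisms lift a scalar function to arrays without hand-written loops, though vmap is a program transformation while Julia's dot syntax is fused broadcasting.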

Profile

Dataflow matrix machines (by Anhinga anhinga)


Page generated Jan. 3rd, 2026 07:45 am
Powered by Dreamwidth Studios