Dictionaries with array-like interface
Aug. 3rd, 2021 03:09 pm
This low-key talk might have been the single most useful talk of the whole JuliaCon 2021:
Dictionaries.jl - for improved productivity and performance
pretalx.com/juliacon2021/talk/WRNAEN/ and live.juliacon.org/talk/WRNAEN (that is, www.youtube.com/watch?v=Y-hAZcqAw28)
github.com/andyferris/Dictionaries.jl
I am going to recall my recent brushes with advanced dictionaries (and with their reshaping into arrays) in the comments.
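To make the "array-like interface" concrete, here is a minimal sketch of the style Dictionaries.jl enables (assuming the package is installed; the commented results are illustrative):

```julia
# Minimal sketch of the array-like interface of Dictionaries.jl
# (assumes the package is installed: ] add Dictionaries)
using Dictionaries

d = Dictionary(["a", "b", "c"], [1, 2, 3])   # an ordered dictionary

# map and filter work as they do on arrays, preserving the keys
squares = map(x -> x^2, d)          # {"a" => 1, "b" => 4, "c" => 9}
evens   = filter(iseven, squares)   # {"b" => 4}

# broadcasting over the values also works
shifted = d .+ 10                   # {"a" => 11, "b" => 12, "c" => 13}
```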
no subject
Date: 2021-08-03 07:17 pm (UTC)
https://github.com/anhinga/2021-notes/tree/main/research-drafts
"Dataflow matrix machines, tree-shaped flexible tensors, neural architecture search, and PyTorch" (Jan 2021)
no subject
Date: 2021-08-03 07:19 pm (UTC)
no subject
Date: 2021-08-03 07:47 pm (UTC)
If the promise of "differential programming" had already been realized, I would say "no".
The insightful statements that AD (automatic differentiation) is foundational for a modern science-oriented and machine-learning-oriented programming language are great, but in reality, at least in Julia, the state-of-the-art AD ecosystem still contains too many defects and is difficult to use. Hopefully, this will improve soon (and perhaps I'll participate in that improvement).
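For a concrete reference point, the kind of basic usage in question looks like this (a sketch with Zygote.jl, one of the main Julia AD packages; assumes it is installed):

```julia
# Minimal reverse-mode AD sketch with Zygote.jl (assumption: package installed).
using Zygote

f(x) = 3x^2 + 2x + 1

g, = gradient(f, 5.0)   # gradient returns one entry per argument
@assert g == 32.0       # f'(x) = 6x + 2, so f'(5) = 32
```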
I have not tried JAX yet, so I don't know if things are better on the Python side of the fence.
And if "differential programming" state-of-the-art in the latest generation of machine learning frameworks is so-so, retrofitting the flexibility into something old and rigid like PyTorch might still be an interesting idea.