Dictionaries with array-like interface
Aug. 3rd, 2021 03:09 pm
This low-key talk might have been the single most useful talk of the whole JuliaCon 2021:

Dictionaries.jl - for improved productivity and performance
pretalx.com/juliacon2021/talk/WRNAEN/ and live.juliacon.org/talk/WRNAEN (that is, www.youtube.com/watch?v=Y-hAZcqAw28)
github.com/andyferris/Dictionaries.jl

I am going to recall my recent brushes with advanced dictionaries (and with their reshaping into arrays) in the comments.
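As a taste of that array-like interface, here is a minimal sketch assuming the API advertised in the Dictionaries.jl README (the key point being that `map` over a `Dictionary` preserves its keys, so dictionaries compose the way arrays do):

```julia
using Dictionaries  # github.com/andyferris/Dictionaries.jl

# A Dictionary is constructed from parallel key and value collections,
# and it preserves insertion order.
d = Dictionary(["a", "b", "c"], [1, 2, 3])

# Unlike Base.Dict, mapping over a Dictionary returns another Dictionary
# with the same keys, only the values transformed.
d2 = map(x -> 10 * x, d)

d2["a"]  # the transformed value stored under key "a"
```

This is what "array-like" means in practice: the keys play the role an array's indices play, and value-wise operations keep them aligned.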
no subject
Date: 2021-08-03 07:17 pm (UTC)

https://github.com/anhinga/2021-notes/tree/main/research-drafts
"Dataflow matrix machines, tree-shaped flexible tensors, neural architecture search, and PyTorch" (Jan 2021)
no subject
Date: 2021-08-03 07:19 pm (UTC)

no subject
Date: 2021-08-03 07:47 pm (UTC)

If the promise of "differential programming" had already been realized, I would say "no".
But in reality, at least in Julia, the insightful statements that AD (automatic differentiation) is foundational for a modern science-oriented and machine-learning-oriented programming language are great, yet the state-of-the-art AD ecosystem still contains too many defects and is difficult to use. Hopefully, this will improve soon (and perhaps I'll participate in this improvement).
I have not tried JAX yet, so I don't know if things are better on the Python side of the fence.
And if the "differential programming" state of the art in the latest generation of machine learning frameworks is so-so, retrofitting this flexibility into something old and rigid like PyTorch might still be an interesting idea.
no subject
Date: 2021-08-03 07:27 pm (UTC)

Bagwell's "Ideal Hash Trees" paper is presumably behind all modern implementations of persistent data structures in Clojure and other languages:
https://lampwww.epfl.ch/papers/idealhashtrees.pdf (also posted here https://hashingit.com/elements/research-resources/2001-ideal-hash-trees.pdf )
These ideal hash trees are more like Tries ("Hash Array Mapped Trie").
Quoting the paper: "Array Mapped Tries (AMT), first described in Fast and Space Efficient Trie Searches, Bagwell [2000], form the underlying data structure."
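The core trick of an AMT node can be sketched in a few lines of Julia (a hypothetical `AMTNode` for illustration, not the paper's actual code): a node stores a 32-bit bitmap plus a dense array holding only the occupied slots, and the dense position of a 5-bit symbol is a popcount over the bitmap bits below it.

```julia
# Sketch of the bitmap trick at the core of an Array Mapped Trie (AMT/HAMT).
# Names here (AMTNode, has_slot, slot_index) are illustrative, not from the paper.

struct AMTNode
    bitmap::UInt32       # bit s is set iff symbol s has a child
    slots::Vector{Any}   # dense array, one entry per set bit in `bitmap`
end

# Is the slot for symbol `s` (0..31) present?
has_slot(n::AMTNode, s::Integer) = (n.bitmap >> s) & 1 == 1

# Dense position (1-based) of symbol `s`, assuming its bit is set:
# count the set bits strictly below bit `s`.
slot_index(n::AMTNode, s::Integer) =
    count_ones(n.bitmap & ((UInt32(1) << s) - 1)) + 1
```

For example, a node with bits 2 and 7 set (`bitmap = 0b10000100`) stores just two slots, and symbol 7 maps to dense position 2; nothing is wasted on the 30 empty slots.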
no subject
Date: 2021-08-03 07:30 pm (UTC)

There is a Julia implementation along these lines, with "PersistentHashMap" and such, with the foundational file being "src/BitmappedVectorTrie.jl".
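The "persistent" part of such structures rests on path copying with structural sharing: an update copies only the nodes on the path to the change and shares everything else. A hypothetical minimal sketch (not the actual package code):

```julia
# Path copying with structural sharing, the idea behind persistent tries.
# PNode and assoc are illustrative names, not from any package.

struct PNode
    children::Vector{Any}
end

# Return a new node with slot `i` replaced; all other children are shared
# (the shallow copy duplicates only the one vector of references).
function assoc(n::PNode, i::Int, v)
    kids = copy(n.children)
    kids[i] = v
    PNode(kids)
end

a = PNode(Any[1, 2, 3])
b = assoc(a, 2, 20)
# `a` is unchanged; `b` shares the untouched entries with `a`.
```

In a real trie this copying is applied along a whole root-to-leaf path, so an update costs O(depth) new nodes while the old version stays fully intact.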