no subject
Date: 2020-02-08 05:32 pm (UTC)
First of all, Flux is based on a new source-to-source automatic differentiation system, which claims to be the best in the world at the moment (it is certainly better than anything I've seen so far):
https://github.com/FluxML/Zygote.jl
https://arxiv.org/abs/1810.07951
(See, in particular, Section 5, "Related Work", of the latest version of the paper (March 2019).)
They say that the approach supports the full flexibility and dynamism of the Julia language, including control flow, nesting, mutation, recursion, closures, data structures (structs, dictionaries, etc.), higher-order functions, and other language constructs, and that the generated code is handed to the existing Julia compiler to produce highly efficient differentiated code.
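To make that concrete, here is a minimal sketch of what this looks like in practice, using Zygote's standard gradient entry point (the toy function below is my own illustration, not from the paper):

using Zygote

# An ordinary Julia function with a loop and mutation of a local
# variable -- no tensors, no special AD types required.
function f(x)
    y = zero(x)
    for i in 1:3
        y += x^i          # y = x + x^2 + x^3
    end
    return y
end

# Zygote differentiates the source code itself:
# d/dx (x + x^2 + x^3) = 1 + 2x + 3x^2, which equals 17 at x = 2.
gradient(f, 2.0)          # returns (17.0,)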
Flux itself is presented as "the ML library that doesn't make you tensor":
https://github.com/FluxML/Flux.jl
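A minimal sketch of the Flux side, assuming the API as it stood in early 2020 (Chain/Dense layers and the implicit-parameter style of gradient; the tiny dimensions are arbitrary):

using Flux

# A small two-layer model; Dense layers and Chain are ordinary
# Julia callables, so m(x) is just a function call.
m = Chain(Dense(3, 4, relu), Dense(4, 1))

x, y = rand(Float32, 3), rand(Float32, 1)
loss(x, y) = Flux.mse(m(x), y)

# Gradients with respect to all trainable parameters of m,
# computed by Zygote under the hood.
gs = gradient(() -> loss(x, y), Flux.params(m))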
So, this seems to be a perfect fit for dataflow matrix machines, where we can truly benefit from not having to contort things into fixed-shape tensors and from the ability to take gradients of arbitrary source code.
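For instance, nothing forces the differentiated state to be a tensor. Here is a hypothetical sketch of taking a gradient through a plain Julia struct (the Affine type is my own illustration):

using Zygote

# A plain struct used as a model -- no tensor framework involved.
struct Affine
    W
    b
end
(m::Affine)(x) = m.W * x .+ m.b

m = Affine(rand(2, 3), rand(2))
x = rand(3)

# The gradient mirrors the struct's shape as a named tuple
# (W = ..., b = ...), rather than being flattened into one tensor.
gs = gradient(m -> sum(m(x)), m)[1]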
This is a great framework for many other machine learning applications as well.
Their front page is very cool:
https://fluxml.ai/