Date: 2020-02-08 05:32 pm (UTC)
dmm: (0)
From: [personal profile] dmm
I started playing with Julia Flux a bit, and also reading more about it.

First of all, it is built on Zygote, a new source-to-source automatic differentiation system whose authors claim it is the most capable one currently available (it is certainly more flexible than anything I've seen so far):

https://github.com/FluxML/Zygote.jl

https://arxiv.org/abs/1810.07951

(See, in particular, Section 5, "Related Work", of the latest version of the paper (March 2019).)

They say that the approach can support the full flexibility and dynamism of the Julia language, including control flow, nesting, mutation, recursion, closures, data structures (structs, dictionaries, etc.), higher-order functions, and other language constructs, and that the output is given to an existing compiler to produce highly efficient differentiated code.
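As a small illustration of that claim, here is a sketch of Zygote differentiating plain Julia code containing a loop (the function `f` and the sample point are my own, not from the paper):

```julia
using Zygote

# An ordinary Julia function with control flow; no tensor DSL required.
function f(x)
    y = zero(x)
    for i in 1:3
        y += x^i        # y = x + x^2 + x^3
    end
    return y
end

# `gradient` returns a tuple with one derivative per argument.
# f'(x) = 1 + 2x + 3x^2, so at x = 2.0 the derivative is 17.0.
g, = gradient(f, 2.0)
```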

Flux itself is presented as "the ML library that doesn't make you tensor".

https://github.com/FluxML/Flux.jl
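For flavor, a minimal sketch of how Flux uses Zygote under the hood, assuming the Flux API as of early 2020 (the implicit-parameters interface; the model and loss here are arbitrary examples of mine):

```julia
using Flux

# A tiny two-layer model; `relu` is re-exported by Flux from NNlib.
m = Chain(Dense(3, 2, relu), Dense(2, 1))
x = rand(Float32, 3)

ps = Flux.params(m)                   # collect the trainable parameter arrays
gs = gradient(() -> sum(m(x)), ps)    # Zygote.Grads, keyed by parameter array
```

The resulting `gs` can be fed to an optimizer; the point is that the loss is just a Julia closure, differentiated by the same machinery as any other code.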

So, this seems to be a perfect fit for dataflow matrix machines, where we can truly benefit from not having to contort things into fixed-shape tensors and from the ability to take gradients of arbitrary source code.
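The "no fixed-shape tensors" point can be made concrete: Zygote will, as I understand it, differentiate through ordinary data structures such as dictionaries, returning a gradient with matching structure (a toy example of mine, not from the Flux docs):

```julia
using Zygote

# Parameters held in a Dict rather than a flat array.
d = Dict(:a => 3.0, :b => 2.0)

# The gradient comes back as a Dict with the same keys:
# d/da (a^2 + b) = 2a = 6.0, and d/db (a^2 + b) = 1.0.
g, = gradient(d -> d[:a]^2 + d[:b], d)
```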

This is a great framework for many other machine learning applications as well.

Their front page is very cool:

https://fluxml.ai/