Jan. 26th, 2020

It seems that people are gradually getting dissatisfied with the state of machine learning frameworks (and, in particular, with how huge and unwieldy they have become, while not delivering enough). The most interesting new trend comes from the Julia community, in particular, in connection with their new Flux framework. I highly recommend this blog post on the topic:

julialang.org/blog/2018/12/ml-language-compiler/

Perhaps we can finally start moving away from Python's dominance in machine learning. A transition to Julia does sound attractive; it's a very nice, performant language.

***

But even inside Google Brain, it seems that people are getting fed up with TensorFlow. Here is what their new "Reformer" (a super-efficient incarnation of the famous class of attention-based Transformer language models), ai.googleblog.com/2020/01/reformer-efficient-transformer.html, uses instead of TensorFlow:

This framework: github.com/google/trax

Based on this experimental thing: github.com/google/jax ("Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more"; note that this still uses TensorFlow's XLA (Accelerated Linear Algebra) domain-specific optimizing compiler for linear algebra)
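To make the "composable transformations" idea concrete, here is a minimal sketch of my own (not taken from the JAX documentation or from Trax); the toy squared-error loss and the array shapes are just illustrative assumptions:

# differentiate / vectorize / JIT, composed on an ordinary Python+NumPy-style function
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # simple squared-error loss for a linear model (purely illustrative)
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# differentiate: gradient of the loss with respect to the first argument, w
grad_loss = jax.grad(loss)

# vectorize: per-example losses over a batch, without writing an explicit loop
per_example_loss = jax.vmap(loss, in_axes=(None, 0, 0))

# JIT: compile the gradient computation through XLA for CPU/GPU/TPU
fast_grad = jax.jit(grad_loss)

w = jnp.zeros(3)
x = jnp.ones((5, 3))
y = jnp.ones(5)
print(fast_grad(w, x, y))          # compiled gradient, shape (3,)
print(per_example_loss(w, x, y))   # vector of 5 per-example losses

The point is that grad, vmap, and jit are ordinary higher-order functions, so they can be stacked in any order on top of plain numerical code.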


