It seems that people are gradually getting dissatisfied with the state of machine learning frameworks (and, in particular, with how huge and unwieldy they have become while still not delivering enough). The most interesting new trend comes from the Julia community, specifically in connection with their new Flux framework. I highly recommend this blog post on the subject:
julialang.org/blog/2018/12/ml-language-compiler/
Perhaps we can finally start moving away from Python's dominance in machine learning. A transition to Julia does sound attractive; it's a very nice, performant language.
***
But even inside Google Brain, it seems that people are getting fed up with TensorFlow. Here is what their new "Reformer" (a highly efficient incarnation of the famous class of attention-based Transformer language models), ai.googleblog.com/2020/01/reformer-efficient-transformer.html, is using instead of TensorFlow:
This framework: github.com/google/trax
Based on this experimental project: github.com/google/jax ("Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more"; note that JAX still relies on TensorFlow's XLA (Accelerated Linear Algebra) domain-specific optimizing compiler for linear algebra)
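To give a flavor of what "composable transformations" means in practice, here is a minimal sketch (assuming JAX is installed; the loss function and data are made up for illustration) showing how `grad`, `vmap`, and `jit` compose:

```python
# Minimal sketch of JAX's composable transformations.
# The loss function and inputs here are hypothetical examples.
import jax
import jax.numpy as jnp

def loss(w, x):
    # A toy scalar loss: squared error of w*x against a target of 1.0.
    return jnp.sum((w * x - 1.0) ** 2)

grad_loss = jax.grad(loss)                          # differentiate w.r.t. w
batched_grad = jax.vmap(grad_loss, in_axes=(None, 0))  # vectorize over a batch of x
fast_grad = jax.jit(batched_grad)                   # JIT-compile the result via XLA

w = 2.0
xs = jnp.array([1.0, 2.0, 3.0])
print(fast_grad(w, xs))  # per-example gradients: [2., 12., 30.]
```

The key point is that each transformation takes an ordinary Python function and returns another function, so they can be stacked freely; this is the design that Trax builds on.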