Dataflow matrix machines (Anhinga anhinga) wrote 2021-08-03 07:47 pm (UTC):

Of course, it is an interesting question whether it's worthwhile to try to upgrade rigid frameworks to handle more flexible situations, such as tree-like data structures.
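For concreteness, here is a minimal sketch of what "gradients through tree-like data" can look like when a framework does support this, using Zygote.jl in Julia (the nested parameter structure and the toy loss below are made up purely for illustration):

    using Zygote

    # Hypothetical tree-like parameters: a nested named tuple
    # (the names and values here are invented for this example).
    params = (w = [1.0, 2.0], child = (w = [3.0],))

    # A toy loss that walks the whole tree.
    loss(p) = sum(p.w .^ 2) + sum(p.child.w .^ 2)

    # Zygote returns a gradient with the same nested shape:
    # (w = [2.0, 4.0], child = (w = [6.0],))
    g = Zygote.gradient(loss, params)[1]

The point is that the gradient mirrors the structure of the data, so nothing has to be flattened into a fixed tensor layout by hand; a rigid framework would force exactly that kind of flattening.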

If the promise of "differentiable programming" had already been realized, I would say "no".

But at least in Julia, the insightful statements that AD (automatic differentiation) is foundational for a modern science-oriented and machine-learning-oriented programming language are great, yet in reality the state-of-the-art AD ecosystem in Julia still contains too many defects and is difficult to use. Hopefully, this will improve soon (and perhaps I'll participate in this improvement).

I have not tried JAX yet, so I don't know if things are better on the Python side of the fence.

And if "differential programming" state-of-the-art in the latest generation of machine learning frameworks is so-so, retrofitting the flexibility into something old and rigid like PyTorch might still be an interesting idea.
