Of course, it is an interesting question whether it's worthwhile to try to upgrade rigid frameworks to handle more flexible situations, such as tree-like data structures.
If the promise of "differential programming" had already been realized, I would say "no".
But in reality, at least in Julia, the insightful statement that AD (automatic differentiation) is foundational for a modern science-oriented and machine-learning-oriented programming language is great in principle, yet the state-of-the-art AD ecosystem in Julia still contains too many defects and is difficult to use. Hopefully this will improve soon (and perhaps I'll participate in that improvement).
I have not tried JAX yet, so I don't know if things are better on the Python side of the fence.
And if "differential programming" state-of-the-art in the latest generation of machine learning frameworks is so-so, retrofitting the flexibility into something old and rigid like PyTorch might still be an interesting idea.