Date: 2021-08-03 07:47 pm (UTC)
From: [personal profile] dmm
Of course, it is an interesting question whether it's worthwhile to try to upgrade rigid frameworks so that they handle more flexible situations, such as tree-like data structures.

If the promise of "differentiable programming" had already been realized, I would say "no".

But in reality, at least in Julia, the situation is mixed: the insight that AD (automatic differentiation) is foundational for a modern science-oriented and machine-learning-oriented programming language is great, but the state-of-the-art Julia AD ecosystem still contains too many defects and is difficult to use. Hopefully, this will improve soon (and perhaps I'll participate in this improvement).
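To give a sense of the kind of flexibility I have in mind (differentiating through tree-like data rather than just flat arrays), here is a minimal sketch using Zygote, one of the reverse-mode AD packages in the Julia ecosystem; the toy loss and parameter names are purely illustrative:

    using Zygote  # reverse-mode AD package from the Julia ecosystem

    # A toy "loss" over a nested (tree-like) parameter structure.
    params = (W = [1.0 2.0; 3.0 4.0], b = [0.5, -0.5])
    loss(p) = sum(p.W * p.b .+ p.b)

    # Zygote returns a gradient with the same nested shape as `params`
    # (a named tuple with fields W and b), which is exactly the kind of
    # structural flexibility that rigid array-only frameworks lack.
    grads, = gradient(loss, params)

When this works, it is quite pleasant; the practical trouble is how often one hits missing differentiation rules or cryptic errors once the structures become less trivial.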

I have not tried JAX yet, so I don't know if things are better on the Python side of the fence.

And if the state of the art of "differentiable programming" in the latest generation of machine learning frameworks is only so-so, then retrofitting this flexibility into something old and rigid like PyTorch might still be an interesting idea.