Jun. 12th, 2022

I have always felt somewhat awkward that my most cited paper has been an industrial, pre-machine-learning paper on how to extract geographic references from text.

Recently, our 2009 paper in the American Mathematical Monthly on non-zero self-distances has finally surpassed it in the number of citations.

Meanwhile, the two papers I like the most have exactly zero citations.

Five months into a new job, one software achievement stands out: I now know how to take automatically computed gradients with respect to variables assembled inside nested dictionaries (see the sketch below). So I am no longer forced to reshape complicated tree-like structures into flat arrays in order to use differentiable programming.
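The post does not name the framework, so here is a minimal sketch of the idea in JAX, which treats nested dictionaries as "pytrees" and differentiates with respect to them directly; the parameter names and the toy loss are purely illustrative assumptions.

```python
import jax
import jax.numpy as jnp

# Hypothetical parameters assembled as a nested dictionary (a tree-like structure),
# rather than flattened into a single array.
params = {
    "layer1": {"weights": jnp.ones((2, 2)), "bias": jnp.zeros(2)},
    "layer2": {"weights": jnp.array([0.5, 0.5])},
}

def loss(p):
    # A toy scalar loss that touches every leaf of the nested dictionary.
    h = p["layer1"]["weights"] @ jnp.array([1.0, 2.0]) + p["layer1"]["bias"]
    return jnp.sum(h * p["layer2"]["weights"])

# jax.grad returns gradients with the same nested-dictionary structure as `params`,
# so no reshaping into flat arrays is needed.
grads = jax.grad(loss)(params)
print(grads["layer1"]["bias"])  # gradient with respect to the bias leaf
```

The same pattern works for arbitrarily deep nesting, which is what makes it convenient for tree-like parameter structures.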

As a result, I can finally experiment with DMM (dataflow matrix machine) training using gradient methods without putting too much labor into setting up those experiments.

🇺🇦 🇺🇦 🇺🇦 Links are in the comments 🇺🇦 🇺🇦 🇺🇦
