Citations; software progress
Jun. 12th, 2022 08:20 am
I have always felt somewhat awkward that my most cited paper has been an industrial, pre-machine-learning paper on how to extract geographic references from text.
Recently, our 2009 paper in the American Mathematical Monthly on non-zero self-distances has finally surpassed it in the number of citations.
Meanwhile, the two papers I like the most have exactly zero citations.
Five months into a new job, among the software achievements: I now know how to take automatically computed gradients with respect to variables assembled inside nested dictionaries, so I am no longer forced to reshape complicated tree-like structures into flat arrays in order to use differentiable programming.
As a result, I can finally experiment with DMM training using gradient methods without putting too much labor into those experiments.
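A minimal sketch of the kind of code this enables (the library is my assumption here: Zygote.jl in Julia; the nested parameter dictionary and the toy loss are purely illustrative, not taken from the actual project):

    # Illustrative sketch, assuming Zygote.jl: a gradient taken directly with
    # respect to parameters stored in a nested dictionary, without flattening
    # the tree-like structure into a single array.
    using Zygote

    params = Dict(
        "unit1" => Dict("w" => [1.0, 2.0, 3.0], "b" => 0.5),
        "unit2" => Dict("w" => [0.1, -0.2, 0.3]),
    )

    # A toy loss that reaches into the nested structure.
    loss(p) = sum(abs2, p["unit1"]["w"] .- p["unit2"]["w"]) + p["unit1"]["b"]^2

    grads = Zygote.gradient(loss, params)[1]

    # `grads` is expected to mirror the nesting of `params`,
    # e.g. grads["unit1"]["w"] and grads["unit1"]["b"].

The point is that nothing here requires reshaping the tree of parameters into one flat vector before differentiating.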
🇺🇦 🇺🇦 🇺🇦 Links are in the comments 🇺🇦 🇺🇦 🇺🇦
no subject
Date: 2022-11-20 01:29 am (UTC)
https://github.com/anhinga/late-2022-julia-drafts/tree/main/dmm-port-from-clojure
500 contributions on GitHub in the last year again.
no subject
Date: 2022-11-27 08:27 am (UTC)
2) The standard first self-referential experiment (a network inducing a wave pattern in its own connectivity matrix) has just started to work, so we can now say that a port from Clojure to Julia does exist.
3) 554 contributions on GitHub in the last year at the moment (9 of them will expire soon)
no subject
Date: 2022-12-11 07:40 pm (UTC)
Just finished an outline for a possible book on "Machine Learning in Julia" (I was approached and asked to explore this as a possible book project; I am keeping the outline in a private repository for the time being).
no subject
Date: 2023-01-10 04:53 am (UTC)
Not doing the book (I decided it does not make sense to undertake such a project now, and especially on the publisher's terms).
no subject
Date: 2023-03-15 06:27 pm (UTC)
We have a new preprint, "Safety without alignment": https://arxiv.org/abs/2303.00752
no subject
Date: 2023-03-16 04:58 pm (UTC)
That work-related activity ended half a year ago; right now it would be very much in my way...
In fact, that was Friday, September 16, and on Monday, September 19, Scott Alexander published a post about Janus's work, and it turned out to be just the right moment to switch entirely to the new set of problems...
no subject
Date: 2023-07-19 01:38 pm (UTC)
https://github.com/anhinga/2023-notes/blob/main/non-anthropocentric-ai-safety/understanding-as-of-july-2023/README.md
no subject
Date: 2023-08-10 08:31 pm (UTC)
The GitHub contributions count for the last year will probably reach 1500 by mid-September; then it'll stop growing (and might easily go down).
no subject
Date: 2023-10-18 12:06 am (UTC)
400 karma on LessWrong, 5 posts, 105 comments
no subject
Date: 2023-11-15 03:24 pm (UTC)
1360 contributions on GitHub (the high watermark is too high for me to maintain, as I expected, although I am committing at a good rate)