Conferences
Jun. 3rd, 2023 08:54 am
The details will be in comments (I have mixed feelings about all this).
I am seeing people talking about their talks being accepted for Strange Loop. It seems that this will be the last Strange Loop conference (I have no idea why).
Terence Tao is co-organizing "AI to Assist Mathematical Reasoning" online workshop (June 12-14).
The joint conference of "Algebra and Coalgebra in Computer Science" and "Mathematical Foundations of Programming Semantics" runs June 19-23 (free for online participants).
no subject
Date: 2023-06-20 07:30 pm (UTC)
Special Session "Category Theory in Machine Learning"
Lindley 102
15:30 - 15:40
Welcome and Introduction
Fabio Zanasi
Abstract
I will introduce the session and our speakers by giving a very brief overview of the state of the art in categorical approaches to machine learning.
15:40 - 16:10
Differential Categories and Machine Learning
Jean-Simon Pacaud Lemay
Abstract
Differential categories provide the categorical semantics of differentiation in both mathematics and computer science. Differential categories have also found many interesting applications, most recently in relation to automatic differentiation and machine learning. In this talk, I will give an overview of how differential categories (including reverse differential categories and tangent categories) are used for machine learning and other related areas. I'll talk about the past (what's been done so far), the present (what's currently being worked on), and the future (some important questions that still need to be solved).
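To make the connection to automatic differentiation concrete, here is a minimal Haskell sketch of the reverse chain rule that reverse differential categories axiomatize. This is my own illustration, not material from the talk; the names RDiff, fwd and rev are made up for the example.

```haskell
-- A morphism in a reverse differential category, modelled as a function
-- paired with its reverse derivative R[f] : A x B -> A.
data RDiff a b = RDiff
  { fwd :: a -> b        -- the underlying map f : A -> B
  , rev :: a -> b -> a   -- the reverse derivative R[f]
  }

-- Composition follows the reverse chain rule:
--   R[g . f](a, c) = R[f](a, R[g](f a, c))
compose :: RDiff b c -> RDiff a b -> RDiff a c
compose g f = RDiff
  { fwd = fwd g . fwd f
  , rev = \a c -> rev f a (rev g (fwd f a) c)
  }

-- Example: square x = x^2 with R[square](x, dy) = 2*x*dy,
--          triple y = 3*y with R[triple](y, dz) = 3*dz.
square, triple :: RDiff Double Double
square = RDiff (\x -> x * x) (\x dy -> 2 * x * dy)
triple = RDiff (* 3) (\_ dz -> 3 * dz)

main :: IO ()
main = do
  let h = compose triple square   -- h(x) = 3 * x^2
  print (fwd h 2.0)               -- 12.0
  print (rev h 2.0 1.0)           -- 12.0 = d/dx (3x^2) at x = 2
```

This is exactly the pattern reverse-mode automatic differentiation implements: gradients flow backwards through a composite, with each morphism carrying its own reverse derivative.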
16:10 - 16:40
A dynamic monoidal category for deep learning
Brandon Shapiro
Abstract
A gradient descender can be regarded as an "open dynamical system" in which the state of the system (parameters to a function) is updated based on external input (feedback). Open dynamical systems can be modeled using the theory of coalgebras for polynomial functors, which can be composed in network configurations to build up complex systems from simpler parts, and the structure of series and parallel composition of these systems is described by the notion of dynamic monoidal categories. I will introduce dynamic monoidal categories, describe how gradient descenders form an example, and discuss avenues for using this abstract structure to produce concrete applications.
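Here is a minimal Haskell sketch of a gradient descender as an open dynamical system in the Moore-machine style the abstract describes: the state is updated from external input, and an output is exposed to the environment. Again my own illustration, not from the talk; the names Moore and descender are made up.

```haskell
-- An open dynamical system as a Moore machine: a state, an output read off
-- the state, and an update driven by external input (feedback).
data Moore s i o = Moore
  { current :: s            -- current state
  , readout :: s -> o       -- what the system exposes to its environment
  , update  :: s -> i -> s  -- how feedback changes the state
  }

step :: Moore s i o -> i -> Moore s i o
step m i = m { current = update m (current m) i }

-- A gradient descender: the state is a parameter, the output is the
-- parameter itself, and the input is a gradient used to update the state.
descender :: Double -> Double -> Moore Double Double Double
descender lr theta0 = Moore theta0 id (\theta g -> theta - lr * g)

main :: IO ()
main = do
  -- Minimize f(theta) = theta^2 (gradient 2*theta) by repeatedly feeding
  -- the system the gradient of its own exposed output.
  let loop m 0 = m
      loop m n = loop (step m (2 * readout m (current m))) (n - 1 :: Int)
  print (current (loop (descender 0.1 5.0) 50))  -- close to 0
```

The "open" part is that the gradient arrives as external input: wiring the loss function's gradient back into the system is one instance of the network composition the abstract mentions.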
16:40 - 17:10
Is there a place for semantics in machine learning?
Prakash Panangaden
Abstract
Machine learning, especially deep learning, has had unprecedented success in the past decade and a half and has generated much excitement beyond the confines of academia. As practitioners of formal semantics, what can we contribute? In this highly personal and opinionated talk, I will recount some successes, some promising directions and some things that I am less optimistic about. Certainly, we can try to provide proofs of correctness and gain insight into why things work or don't work. I am less optimistic that we can come up with entirely new algorithms or techniques, but it is still worth trying.
no subject
Date: 2023-06-20 08:00 pm (UTC)
Differential categories come from linear logic (!)
A very nice talk, with a slide deck containing open problems and such (it would be nice to get a copy of it)
And he is working closely with Bruno Gavranović: https://twitter.com/bgavran3