no subject
Date: 2022-06-08 02:07 pm (UTC)

Second talk: Maximilian Nickel (Facebook AI Research in New York)
"Modeling Symbolic Domains via Compositional and Geometric Representations"
In this talk, I will discuss representation learning in symbolic domains and how to use such models for simple reasoning tasks. I will first present compositional models of symbolic knowledge representations such as tensor-product and holographic models, discuss their connections to associative memory, and show that they are able to outperform purely symbolic methods in various deductive reasoning settings. Furthermore, I will discuss how structural properties of symbolic data such as hierarchies and cycles are connected to the geometry of a representation space and how geometric representation learning enables parsimonious models that preserve important semantic properties of the domain. Moreover, I will show how such embeddings can be applied to challenging tasks in NLP and biology. In addition, I will discuss connections of geometric representations to state-of-the-art generative models such as Riemannian continuous normalizing flows and Moser flow."
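The "holographic models" mentioned in the abstract compose two embeddings by circular correlation and score a triple against a relation vector (as in Nickel et al.'s holographic embeddings of knowledge graphs). A minimal sketch of that operator, computed via the FFT; the function names here are my own, not from the talk:

```python
import numpy as np

def circular_correlation(a, b):
    """Circular correlation a (star) b, the compositional operator of
    holographic embeddings; the FFT identity makes it O(d log d)."""
    return np.fft.irfft(np.conj(np.fft.rfft(a)) * np.fft.rfft(b), n=len(a))

def holographic_score(subj, rel, obj):
    """Plausibility of a (subject, relation, object) triple:
    sigmoid of the relation vector dotted with the composed pair."""
    return 1.0 / (1.0 + np.exp(-rel @ circular_correlation(subj, obj)))
```

Unlike a full tensor product, the composed representation stays d-dimensional, which is what makes this family scale to large symbolic domains.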
no subject
Date: 2022-06-08 02:22 pm (UTC)

This talk by Maximilian Nickel is interesting.
He uses hyperbolic metric spaces for hierarchical representations: https://en.wikipedia.org/wiki/Hyperbolic_metric_space
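The concrete model behind this (in Nickel & Kiela's Poincaré-embedding work) is the Poincaré ball, where the geodesic distance has a closed form. A minimal sketch; the `eps` guard is my own addition for numerical safety:

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between points u, v inside the unit ball
    (Poincare ball model of hyperbolic space). Distances blow up as
    points approach the boundary ||x|| -> 1, which is what lets trees
    and hierarchies embed with low distortion."""
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq / max(denom, eps))
```

Intuitively, the root of a hierarchy sits near the origin and leaves near the boundary, so sibling leaves are far apart even though they look close in Euclidean coordinates.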
no subject
Date: 2022-06-08 02:55 pm (UTC)

"Subgraph Aggregation Networks
Message-passing neural networks (MPNNs) are the leading architecture for deep learning on graph-structured data, in large part due to their simplicity and scalability. Unfortunately, it was shown that these architectures are limited in their expressive power. In order to gain more expressive power, a recent trend applies message-passing neural networks to subgraphs of the original graph. In this talk, I will present a representative framework of this family of methods, called Equivariant Subgraph Aggregation Networks (ESAN). The main idea behind ESAN is to represent each graph as a set of subgraphs derived from a predefined policy and to process the set of subgraphs using a suitable equivariant architecture. Our analysis shows that ESAN has favorable theoretical properties and that it performs well in practice. Following this, we will discuss some special properties of popular subgraph selection policies by connecting subgraph GNNs with previous work in equivariant deep learning."
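The ESAN recipe in the abstract — derive a bag of subgraphs from a fixed policy, run a shared message-passing network on each, aggregate — can be sketched in a few lines. This is a toy illustration with a node-deletion policy and sum aggregation, not the paper's full DSS architecture, and all function names are mine:

```python
import numpy as np

def node_deleted_subgraphs(adj):
    """Subgraph selection policy: one subgraph per deleted node."""
    subgraphs = []
    for v in range(adj.shape[0]):
        a = adj.copy()
        a[v, :] = 0.0
        a[:, v] = 0.0
        subgraphs.append(a)
    return subgraphs

def mpnn_layer(adj, feats, weights):
    """One message-passing layer: relu((A + I) H W) with sum aggregation."""
    return np.maximum((adj + np.eye(len(adj))) @ feats @ weights, 0.0)

def esan_embed(adj, feats, weights, policy=node_deleted_subgraphs):
    """Shared MPNN over each subgraph in the bag, sum readout per
    subgraph, then sum over the bag (the simplest equivariant choice)."""
    reps = [mpnn_layer(a, feats, weights).sum(axis=0) for a in policy(adj)]
    return np.sum(reps, axis=0)
```

Because the policy, the readout, and the bag aggregation are all permutation-symmetric, relabeling the graph's nodes leaves the embedding unchanged, yet the bag of subgraphs can separate graphs that a plain MPNN cannot.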
no subject
Date: 2022-06-08 04:24 pm (UTC)

Interesting talk.