https://pretalx.com/juliacon2023/talk/L9S3HT/
Neuroblox.jl: biomimetic modeling of neural control circuits
07-26, 15:00–15:30 (US/Eastern), 32-D463 (Star)
Neuroblox.jl is a Julia module designed for computational neuroscience and psychiatry applications. Our tools range from control-circuit system identification to brain-circuit simulations bridging scales from spiking neurons to fMRI-derived circuits, and include parameter fitting of models to neuroimaging data, modeling of interactions between the brain and other physiological systems, experimental optimization, and scientific machine learning.
Neuroblox.jl is based on a library of modular computational building blocks (“blox”) in the form of systems of symbolic dynamic differential equations that can be combined to describe large-scale brain dynamics. Once a model is built, it can be simulated efficiently and fit to electrophysiological and neuroimaging data. Moreover, the circuit behavior of multiple model variants can be investigated to aid in distinguishing between competing hypotheses.
We employ ModelingToolkit.jl to describe the dynamical behavior of blox as symbolic (stochastic/delay) differential equations. Our libraries of modular blox consist of individual neurons (Hodgkin-Huxley, IF, QIF, LIF, etc.), neural mass models (Jansen-Rit, Wilson-Cowan, Larter-Breakspear, next-generation, canonical microcircuits, etc.), and biomimetically constrained control-circuit elements. A GUI designed to be intuitive to neuroscientists lets researchers build models that automatically generate high-performance numerical systems of ordinary/stochastic differential equations, from which one can run simulations with parameters fit to experimental data. Our benchmarks show that simulation speedups often exceed a factor of 100 compared with the neural mass model implementations in The Virtual Brain (Python) and similar packages in MATLAB. For parameter fitting of brain-circuit dynamical models, we use Turing.jl to perform probabilistic modeling, including Hamiltonian Monte Carlo sampling and Automatic Differentiation Variational Inference.
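To give a flavor of the symbolic approach, here is a minimal ModelingToolkit.jl sketch of a Wilson-Cowan-style neural mass written as a symbolic ODE system. This is illustrative only, not Neuroblox's actual API; the parameter values are arbitrary.

```julia
using ModelingToolkit, OrdinaryDiffEq

@variables t
D = Differential(t)
@parameters τ_E=1.0 τ_I=2.0 w_EE=12.0 w_EI=10.0 w_IE=13.0 w_II=11.0
@variables E(t)=0.1 I(t)=0.1

σ(x) = 1 / (1 + exp(-x))                       # sigmoidal firing-rate nonlinearity

eqs = [D(E) ~ (-E + σ(w_EE * E - w_EI * I)) / τ_E,
       D(I) ~ (-I + σ(w_IE * E - w_II * I)) / τ_I]

@named wc = ODESystem(eqs, t)                  # one "blox": a symbolic neural mass
prob = ODEProblem(structural_simplify(wc), [], (0.0, 50.0))
sol = solve(prob, Tsit5())
```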
This talk will demonstrate Neuroblox.jl by building small dynamic brain circuits. We construct circuits by first defining the nodes (“blox”) and then either defining an adjacency matrix, building directed graphs, or sketching the circuit in our GUI to assemble the brain-circuit ODE systems. Using symbolic equations through ModelingToolkit.jl allows us to define additional parameters that vary across the system, which can later be fit to data. We will also demonstrate the simulation of networks of several hundred spiking neurons representing a biomimetic reinforcement learning model. We present visual patterns to these spiking neuron networks and apply a learning algorithm that adjusts the weights to classify the different patterns. The reinforcement learning feedback is implemented by frequent callbacks into the ODE system, which allows us to monitor the learning rate continuously. In addition, we will show examples of a larger brain circuit generating an emergent systems-level cortico-striatal-thalamo-cortical loop, which selectively responds to learned visual patterns and reproduces many dynamic behaviors observed in experimental data from non-human primates. Finally, we will show benchmarks comparing Neuroblox.jl to other implementations of brain circuit simulations. We hope to convince the audience that Julia is the ideal language for computational neuroscience applications.
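As an illustration of the adjacency-matrix workflow, here is a generic sketch using plain ModelingToolkit.jl composition. The `node` helper, the `jcn` input variable, and the weights are hypothetical stand-ins, not Neuroblox's API.

```julia
using ModelingToolkit, OrdinaryDiffEq

@variables t
D = Differential(t)

function node(; name)
    @variables x(t)=0.1 jcn(t)                 # jcn: input collected from other nodes
    @parameters τ=1.0
    ODESystem([D(x) ~ (-x + tanh(jcn)) / τ], t; name)
end

A = [0.0 1.5;                                  # A[i, j]: connection weight j → i
     0.8 0.0]
nodes = [node(name=Symbol(:n, i)) for i in 1:2]
couplings = [nodes[i].jcn ~ sum(A[i, j] * nodes[j].x for j in 1:2) for i in 1:2]

@named circuit = ODESystem(couplings, t; systems=nodes)
prob = ODEProblem(structural_simplify(circuit), [], (0.0, 10.0))
sol = solve(prob, Tsit5())
```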
Helmut Strey
https://pretalx.com/juliacon2023/talk/TXKXKP/
An update on the ITensor ecosystem
07-26, 15:30–16:00 (US/Eastern), 32-G449 (Kiva)
ITensor is a library for running and developing tensor network algorithms, a family of algorithms in which high-order tensors are represented as a network of lower-order, low-rank tensors. I will give an update on the ITensor Julia ecosystem. In particular, I will discuss efforts to support more GPU backends and block sparse operations on GPU, as well as ITensorNetworks.jl, a new library for tensor network algorithms on general graphs.
ITensor is a library for running and developing tensor network algorithms, a family of algorithms in which high-order tensors are represented as a network of lower-order, low-rank tensors. I will give an update on the ITensor Julia ecosystem. In particular, I will discuss updates to our tensor operation backend library, NDTensors.jl, and efforts to make it more extensible and to support dense and block sparse operations on a variety of GPU backends through package extensions. In addition, I will discuss the new ITensorNetworks.jl library, built on top of ITensors.jl for tensor network algorithms on general graphs, with a graph-like interface and graph operations based on Graphs.jl.
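For readers new to the library, here is a minimal taste of the ITensors.jl interface (a sketch using the public API as of 2023; the index dimensions are arbitrary). Contraction is just `*`, and shared indices are summed over automatically:

```julia
using ITensors

i = Index(10, "i"); j = Index(20, "j"); k = Index(30, "k")
A = randomITensor(i, j)
B = randomITensor(j, k)
C = A * B                  # contracts over the shared index j
@show inds(C)              # remaining indices: (i, k)
```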
Matthew Fishman
Research Scientist and Software Engineer at the Flatiron Institute.
https://pretalx.com/juliacon2023/talk/HUCMKV/
Thoughts for the Next Generation
07-26, 16:00–16:30 (US/Eastern), 32-D463 (Star)
The first version of Modelica was released in 1997. Today, over 25 years later, it is still going strong. At the same time, Julia is taking the numerical computing world by storm and ModelingToolkit is revisiting many of the topics and applications that drove the development of Modelica. This talk will highlight many of the important aspects of Modelica's design in the hope that these are taken to heart by the developers of the next generation of modeling and simulation tools.
Modelica is an interesting mix of basically three different topics. At the lowest level, Modelica is concerned with simulation (solving non-linear equations, integrating differential equations, computing Jacobians). In this way, it is very much aligned with Julia and my sense is that Julia has already exceeded what Modelica tools can offer at this level.
The next level above that is the ability to perform symbolic manipulation on the underlying equations so as to generate very efficient simulation code. The capabilities in this layer are what pushed Modelica forward into applications like real-time hardware-in-the-loop simulation, because symbolic manipulation provides enormous benefits here that other tools, lacking symbolic manipulation, could not match. There were some amazing contributions from Pantelides, Elmqvist, Otter, Cellier, and many others in this area, and it is important that people understand these advances.
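To make the symbolic-manipulation layer concrete, here is a small sketch in ModelingToolkit.jl (the Julia counterpart mentioned above; the toy equations are illustrative) showing algebraic equations being eliminated symbolically before any numerics run:

```julia
using ModelingToolkit, OrdinaryDiffEq

@variables t
D = Differential(t)
@variables x(t)=1.0 y(t) z(t)

eqs = [D(x) ~ -z,          # one differential equation...
       y ~ 2x,             # ...plus two algebraic aliases
       z ~ y + x]

@named sys = ODESystem(eqs, t)
simplified = structural_simplify(sys)   # y and z are eliminated symbolically
equations(simplified)                   # a single ODE in x remains; y, z become observed

prob = ODEProblem(simplified, [], (0.0, 1.0))
sol = solve(prob, Tsit5())
```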
Finally, you have the language itself. While the symbolic manipulation opened many doors to extend what was possible with modeling and simulation, the language layered on top addressed very important ideas around usability. It brought in many important ideas from software engineering to make modeling and simulation a scalable enterprise, allowing model developers to create models with potentially thousands of components and tens or even hundreds of thousands of equations. But it also brought usability with a representation that quite naturally and seamlessly allowed graphical, schematic-based modeling to be implemented consistently by a number of different tools.
None of this is to say that Modelica is perfect or that better tools aren't possible. It is simply to raise awareness of all the things Modelica got right so that future generations can build on that and create even more amazing and capable tools to push the frontiers of modeling and simulation even further.
Michael Tiller
Michael Tiller has a Ph.D. in Mechanical Engineering from the University of Illinois, Urbana-Champaign. He is the Secretary of the Modelica Association, President of the North America Modelica Users' Group, author of two books on Modelica and the CTO of Realis Simulation.
https://pretalx.com/juliacon2023/talk/PX99DC/
Tensor network contraction order optimization algorithms
07-26, 16:15–16:30 (US/Eastern), 32-G449 (Kiva)
In this talk, I will introduce the algorithms used to find optimal contraction orders for tensor networks, which are implemented in the OMEinsumContractionOrders.jl package. These algorithms have a wide range of applications, including simulating quantum circuits, solving inference problems, and solving combinatorial optimization problems.
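As a sketch of what contraction order optimization looks like in practice, here is a small example using the OMEinsum.jl front end together with OMEinsumContractionOrders.jl. The function names follow the packages' documented interfaces at the time of writing and may evolve:

```julia
using OMEinsum, OMEinsumContractionOrders

code = ein"ij,jk,kl,lm->im"                    # a chain of four matrix products
sizes = uniformsize(code, 100)                 # assume every index has dimension 100
optcode = optimize_code(code, sizes, TreeSA()) # simulated-annealing order optimizer
contraction_complexity(optcode, sizes)         # inspect time/space complexity

A, B, C, D = (randn(100, 100) for _ in 1:4)
optcode(A, B, C, D)                            # contract with the optimized order
```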
JinGuo Liu
https://pretalx.com/juliacon2023/talk/CQEE8M/
Expronicon: a modern toolkit for meta-programming in Julia
07-26, 16:10–16:20 (US/Eastern), 32-141
Expronicon is a toolkit for metaprogramming in Julia, offering a rich set of functions for analyzing, transforming, and generating Julia expressions, first-class support for MLStyle's pattern matching, and type-stable algebraic data types for efficient and simple code generation. Perfect for boosting productivity and improving coding efficiency.
Expronicon is a comprehensive toolkit built on top of the awesome MLStyle package for functional programming features. It features
a rich set of tools for analyzing, transforming, and generating Julia expressions
type-stable algebraic data types
expansion of compile-time dependencies, helping you remove dependencies that are not needed at runtime and improve your package's loading time
ExproniconLite: a lightweight version with zero dependencies for latency-sensitive applications
Please refer to the documentation website for more information: https://expronicon.rogerluo.dev/ — a small example of the pattern-matching style Expronicon builds on is sketched below.
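This sketch uses MLStyle directly (the foundation Expronicon builds on) to pattern-match and rewrite Julia expressions; it illustrates the style of tooling, not Expronicon's own function names:

```julia
using MLStyle

# Rewrite unary minus as `0 - x` everywhere in an expression tree.
rewrite(ex) = @match ex begin
    Expr(:call, :-, x)      => :(0 - $(rewrite(x)))                  # unary minus
    Expr(:call, f, args...) => Expr(:call, f, map(rewrite, args)...) # other calls
    _                       => ex                                    # leaves unchanged
end

rewrite(:(g(-a, h(-b))))   # returns :(g(0 - a, h(0 - b)))
```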
Xiu-zhe (Roger) Luo
Graduate student at the University of Waterloo and Perimeter Institute. I'm interested in exploring quantum many-body physics with machine learning and modern methods of programming. I am also the winner of the 2020 Wittek Quantum Prize.
One of the creators of QuantumBFS/Yao.jl and many other open-source packages in JuliaLang. Contributor to various projects including FluxML/Zygote.jl, FluxML/Flux.jl, and PyTorch.
Core member of JuliaCN, the Julia language localization organization for Chinese.
This speaker also appears in:
Yao.jl & Bloqade.jl: towards a symbolic engine for quantum
https://pretalx.com/juliacon2023/talk/RRBDAA/
StochasticAD.jl: Differentiating discrete randomness
07-26, 16:30–17:00 (US/Eastern), 32-D463 (Star)
Automatic differentiation (AD) is great: use gradients to optimize, sample faster, or just for fun! But what about coin flips? Agent-based models? Nope, these aren’t differentiable... or are they? StochasticAD.jl is an open-source research package for AD of stochastic programs, implementing AD algorithms for handling programs that can contain discrete randomness.
StochasticAD.jl is an open-source research package for automatic differentiation (AD) of stochastic programs. The particular focus is on implementing AD algorithms for handling programs that can contain discrete randomness. But what does this even mean?
Derivatives are all about how functions are affected by a tiny change ε in their input. For example, take the function sin(x). Perturb x by ε, and the output changes by approximately cos(x) * ε: tiny change in, tiny change out. And the coefficient cos(x)? That's the derivative!
But what happens if your function is discrete and random? For example, take a Bernoulli variable, with probability p of being 1 and probability 1-p of being 0. If we perturb p by ε, the output of the Bernoulli variable cannot change by a tiny amount. But in the probabilistic world, there is another way to change by a tiny amount on average: jump by a large amount, with tiny probability.
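To spell out the arithmetic behind this claim, here is a worked version of the Bernoulli example above (a sketch of the reasoning, not the package's formal derivation):

```latex
% X ~ Bernoulli(p): E[X] = p, so the derivative of the mean is 1.
% Perturb p -> p + eps: a sample at 0 flips to 1 with probability eps/(1-p).
\underbrace{(1-p)}_{P(X=0)}
\cdot \underbrace{\tfrac{\varepsilon}{1-p}}_{P(\mathrm{flip})}
\cdot \underbrace{(1-0)}_{\text{jump size}}
\;=\; \varepsilon
\;=\; \frac{d}{dp}\,\mathbb{E}[X]\cdot\varepsilon
% So jumping by a large amount with tiny probability reproduces, on average,
% the derivative of the expectation.
```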
StochasticAD.jl generalizes the well-known concept of dual numbers by including a third component to describe large perturbations with infinitesimal probability. The resulting object is called a stochastic triple, and StochasticAD.jl develops the algorithms to propagate this triple through user-written code involving discrete randomness. Ultimately, the result is a provably unbiased estimate of the derivative of your program, even if it contains discrete randomness!
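A minimal usage sketch, based on the package's documented interface (`stochastic_triple` and `derivative_estimate`; exact printed output may differ):

```julia
using StochasticAD, Distributions, Statistics

f(p) = rand(Bernoulli(p))          # discrete and random: no classical derivative

# Propagate a stochastic triple through f: a value, an infinitesimal perturbation,
# and a finite jump carried with infinitesimal probability.
stochastic_triple(f, 0.5)

# Each call gives an unbiased single-sample derivative estimate;
# averaging recovers d/dp E[f(p)] = 1.
mean(derivative_estimate(f, 0.5) for _ in 1:10_000)
```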
In this talk, we will discuss the workings of StochasticAD.jl, including the underlying theory and the technical implementation challenges.
Frank Schäfer
I am a postdoc in the Julia Lab at the Massachusetts Institute of Technology (MIT).
Previously: PhD in physics in the Bruder group within the “Quantum Computing and Quantum Technology” PhD school at the University of Basel.
https://frankschae.github.io/
This speaker also appears in:
Differentiation of discontinuities in ODEs arising from dosing
Convex Optimization for Quantum Control in Julia