Discovering the decision and communication patterns cells execute when arranging into tissues is a reverse-engineering task. Because the nonlinearity and complexity of the system preclude closed-form theoretical predictions, computer simulations are needed to see which phenotype and developmental dynamics a given mechanistic hypothesis produces. We therefore develop a modular numerical simulation paradigm for biophysical and biological processes in 3D space and time.
We address this with novel hybrid particle-mesh methods designed to meet the intricacies of biological systems, i.e., nonlinearity, internal activity, and complex geometries. Particles serve as collocation points to represent continuous fields. A numerical method developed in our group (DC PSE) enables consistent approximation of differential operators on any particle representation of a function. This provides the freedom to distribute particles arbitrarily, in particular adapted to the simulated geometries and their temporal dynamics. Further, the particles self-organize according to adjustable interaction potentials, providing unprecedented flexibility to simulate multiple scales of resolution.
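To illustrate the core idea, here is a stripped-down sketch of consistent derivative approximation on scattered particles in 1D, in the spirit of DC PSE: each particle solves a small moment system so that a weighted sum of neighbor values reproduces d/dx exactly for polynomials. This is a generic construction for illustration, not our actual implementation, and all parameters in it are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 2 * np.pi, 200))   # arbitrarily placed particles
f = np.sin(x)                                   # field sampled on the particles

def ddx(x, f, n_neigh=7, order=3):
    """Approximate d/dx at every particle from scattered neighbors."""
    df = np.empty_like(f)
    for p in range(len(x)):
        idx = np.argsort(np.abs(x - x[p]))[:n_neigh]   # nearest neighbors (brute force)
        dx = x[idx] - x[p]
        # moment conditions: sum_q w_q * dx_q^k = delta_{k,1} for k = 0..order,
        # so that sum_q w_q * f(x_q) reproduces f'(x_p) for polynomials
        V = np.vander(dx, order + 1, increasing=True).T
        b = np.zeros(order + 1)
        b[1] = 1.0
        w = np.linalg.lstsq(V, b, rcond=None)[0]       # minimum-norm weights
        df[p] = w @ f[idx]
    return df

err = np.max(np.abs(ddx(x, f) - np.cos(x)))
print(f"max error of d/dx sin(x): {err:.2e}")
```

Because the moment conditions are solved per particle, the construction works for any particle placement, which is exactly what frees the discretization to follow the simulated geometry.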
Current work in this project focuses on adapting the Adaptive Particle Representation (APR), originally developed for imaging, to adaptive-resolution numerical simulations with guaranteed error bounds. In addition, we work on Lagrangian methods for simulations on moving and deforming geometries, on solving vector- and tensor-valued equation models of active matter on dynamic 3D surfaces, and on increasing the numerical accuracy of simulations using a novel approach to polynomial interpolation problems.
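As a toy illustration of why adaptive resolution pays off (this is a one-dimensional caricature of gradient-adapted particle placement, not the APR algorithm itself), compare uniform and adapted particle distributions when reconstructing a field with a sharp front:

```python
import numpy as np

f = lambda x: np.tanh(50 * (x - 0.5))            # field with a sharp front
xs = np.linspace(0.0, 1.0, 10_000)               # fine reference grid
density = np.abs(np.gradient(f(xs), xs)) + 1.0   # target particle density ~ local gradient
cdf = np.cumsum(density)
cdf /= cdf[-1]

n = 100
x_adapt = np.interp(np.linspace(0, 1, n), cdf, xs)  # inverse-CDF sampling
x_unif = np.linspace(0.0, 1.0, n)

for name, xp in [("uniform ", x_unif), ("adaptive", x_adapt)]:
    recon = np.interp(xs, xp, f(xp))             # piecewise-linear reconstruction
    print(name, f"max reconstruction error: {np.max(np.abs(recon - f(xs))):.4f}")
```

The same particle budget yields a much smaller error when the particles concentrate where the field varies; guaranteeing such error bounds automatically is what the APR adaptation is about.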
Main Collaborators:
Our simulations help unravel the molecular mechanisms of tissue formation across scales, from molecules to whole organisms. If the simulated mechanisms are sufficient to bring about the experimentally observed behavior, we can further leverage the power of experimental data to infer minimal necessary models. For this, the latest developments in machine learning and data-driven model inference are instrumental.
We pursue two complementary approaches: first, algorithms that infer interpretable and physically consistent mathematical models sufficient to describe the observed dynamics; second, deep-learning architectures for equation-free computational forecasting of dynamics from data. In both, we pay special attention to inference robustness against noise in the data and random variations in the parameters, as we believe robustness to be a universal design principle of biological systems. The concept of Design Centering with Lp-Adaptation has been instrumental here, since it appears to resist the curse of dimensionality.
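A minimal sketch of the first approach: recover an interpretable equation from noisy time-series data by sparse regression over a library of candidate terms, in the spirit of SINDy-style methods. This is not our group's inference algorithm; the test model, library, and threshold below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 6.0, 400)
u0 = 0.1
u = u0 * np.exp(t) / (1.0 + u0 * (np.exp(t) - 1.0))  # logistic growth, u' = u - u^2
u = u + 1e-4 * rng.standard_normal(u.size)           # measurement noise
du = np.gradient(u, t)                               # estimated time derivative

library = np.column_stack([np.ones_like(u), u, u**2, u**3])
names = ["1", "u", "u^2", "u^3"]

# sequentially thresholded least squares: fit, zero out small terms, refit
xi = np.linalg.lstsq(library, du, rcond=None)[0]
for _ in range(10):
    small = np.abs(xi) < 0.1
    xi[small] = 0.0
    keep = ~small
    xi[keep] = np.linalg.lstsq(library[:, keep], du, rcond=None)[0]

print("inferred model: u' =",
      " ".join(f"{c:+.2f}*{n}" for c, n in zip(xi, names) if c != 0.0))
```

The output is a sparse, human-readable equation rather than a black-box predictor, which is what makes the inferred model interpretable and testable against physical constraints.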
We focus on approaches that combine principled knowledge of numerical algorithms with modern machine-learning models. This enables us to develop deep-learning architectures with provable mathematical guarantees and interpretable results. It also allows us to merge data-driven and model-driven approaches, e.g., using machine-learning surrogates to accelerate numerical simulations, or using numerical theory to constrain the training of machine-learning methods for guaranteed consistency. We believe the intersection of machine learning and numerical analysis will be a fruitful area of research for decades to come.
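The surrogate idea fits in a few lines. In this toy sketch, the "simulation" is a stand-in function and the polynomial surrogate is the simplest possible regression model; the point is only the workflow: fit a cheap model to a handful of expensive solver runs, then query the surrogate during parameter sweeps.

```python
import numpy as np

def expensive_simulation(k):
    # stand-in for a costly solver, e.g., the steady state of some
    # hypothetical model as a function of a rate parameter k
    return np.tanh(k) + 0.1 * np.sin(5 * k)

k_train = np.linspace(0.0, 2.0, 12)                  # few expensive runs
y_train = np.array([expensive_simulation(k) for k in k_train])

# cheap surrogate: polynomial least-squares fit to the training runs
surrogate = np.poly1d(np.polyfit(k_train, y_train, deg=6))

k_test = np.linspace(0.0, 2.0, 1000)                 # dense, cheap queries
err = np.max(np.abs(surrogate(k_test) - expensive_simulation(k_test)))
print(f"max surrogate error over the sweep: {err:.3f}")
```

Numerical theory enters when bounding such surrogate errors, or when constraining the surrogate so that, e.g., conservation laws of the original model are preserved by construction.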
The current application focus is on leveraging machine learning to help understand the roles of fluid flow and active cytoskeletal forces in tissue growth and development. In collaboration with experimental groups, we address these questions at the molecular, cellular, and organ levels, with applications in personalized medicine and therapy.
Main Collaborators:
In addition to predictive simulations and data-driven models, reverse-engineering the molecular mechanisms of tissue formation requires direct comparison with microscopy images and a healthy portion of biological and biophysical intuition to come up with novel hypotheses. Both benefit tremendously from recent advances in volumetric microscopy, which allow our collaborators to image developing tissues and organs at cellular resolution in 3D.
This capability, however, has also brought with it the challenge of visualizing, handling, and analyzing terabyte-sized image datasets. We are therefore developing computationally efficient, immersive virtual-reality (VR) visualization with hand-gesture control for 3D microscopy. This system enables users to immerse themselves in the 3D image data, literally “walking around” in a developing tissue, and to explore large datasets. Combined with eye-tracking technology, it can also be used to track cells and reconstruct developmental lineages simply by following them with the eyes. Our system natively visualizes images stored in the Adaptive Particle Representation, reducing data sizes and processing times by orders of magnitude.
Beyond raw volumetric image data, our immersive VR environment is also designed to visualize 3D computer-simulation results. We aim to do so in real time, i.e., while the simulation is running on a remote supercomputer, so that the user can interact with the running simulation, e.g., via hand gestures (“computational steering”). Overlaid with potentially live microscopy images, this would enable interactive 3D model identification and validation of computer simulations.
Currently, we focus on enabling a similar “feedback channel” for microscopy by coupling the VR environment to instrument-control interfaces. This will let users interact directly with the sample under the microscope, e.g., using hand gestures to steer a laser-cutting or ablation unit, or performing interactive optogenetic perturbations with a human in the loop. Another focus is on user studies quantifying the gains in productivity and scientific intuition afforded by such modalities, and on coupling human-in-the-loop interaction with machine-learning background tasks for AI-augmented and AI-supported experimental workflows.
Main Collaborators:
Performing simulations of nonlinear models in 3D geometries, training deep neural networks on 3D data, and visualizing terabyte-sized images in real time all require parallel high-performance computing, in our case on accelerators such as graphics processing units (GPUs) and on high-performance compute clusters. Targeting both, we develop a generic high-performance software environment that significantly reduces code-development times and serves as a common technology platform for our research.
This software environment is the transparent and portable parallel computing framework OpenFPM, the successor to the venerable Parallel Particle Mesh (PPM) library, built on the same data and operation abstractions. OpenFPM not only ensures parallel scalability of codes on both CPU and GPU clusters, but also reduces code-development times from years to weeks. This raises programmer productivity to a level where we can readily experiment with novel numerical methods and rapidly adapt to new biophysical hypotheses or newly available data in our highly dynamic research environment. This computational agility is further supported by a domain-specific programming language for parallel particle-mesh methods and by the integrated OpenPME development environment for scientific computing.
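The particle-mesh abstraction underlying OpenFPM can be sketched generically as follows. This is plain Python for illustration only, not OpenFPM's actual C++ API: particle quantities are scattered to a mesh, mesh-based operations (stencils, FFTs) can be applied there, and results are gathered back onto the particles.

```python
import numpy as np

L, nm = 1.0, 64                        # periodic domain size, number of mesh nodes
h = L / nm
rng = np.random.default_rng(2)
xp = rng.uniform(0.0, L, 500)          # particle positions
qp = np.ones_like(xp)                  # particle quantity (e.g., mass)

# scatter: each particle splits its quantity between its two nearest
# mesh nodes with linear (cloud-in-cell) weights
rho = np.zeros(nm)
i = np.floor(xp / h).astype(int)
w = xp / h - i                         # fractional distance to the left node
np.add.at(rho, i % nm, (1 - w) * qp / h)
np.add.at(rho, (i + 1) % nm, w * qp / h)

# gather: interpolate a mesh field back to the particle positions
field_p = (1 - w) * rho[i % nm] + w * rho[(i + 1) % nm]

print(f"total mass on mesh: {np.sum(rho) * h:.3f} (expected {np.sum(qp):.3f})")
```

OpenFPM provides distributed versions of exactly these ingredients, i.e., particle sets, meshes, and the scatter/gather and neighbor-access operations between them, transparently parallelized across CPUs and GPUs.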
Current work in this project focuses on bringing support for sparse and adaptive-resolution methods to multi-GPU hardware, on fundamental research into the mathematical foundations and language meta-models of particle methods, and on further optimizing and improving OpenFPM as well as extending its use to machine learning, e.g., via Python interfaces and integration with deep-learning frameworks. Finally, we perform research into algorithmic auto-tuning for numerical simulations and into optimizing and parallelizing compilers.
Main Collaborators: