2019

May 6, 2019
A large deviation method for the quantification of extreme surface gravity waves

We interpret the formation of rogue waves on the surface of deep water using tools from large deviation theory and optimal control. We compute the instantons of the problem, i.e. the most likely realizations leading to extreme surface elevations via the governing nonlinear dynamics. Strikingly, the larger waves closely follow the instanton evolution, up to small additional fluctuations. Our results are validated by Monte Carlo simulations and by real experimental data from a wave flume across a wide range of forcing regimes, generalizing the existing theories in the limiting linear and highly nonlinear cases.

The results are obtained in the one-dimensional set-up of the flume, but the method is general and can be extended to the fully two-dimensional case of the ocean. In principle, the framework is exportable to other nonlinear physical systems to study the mechanisms underlying extreme events and assess their risk.
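The instanton computation can be illustrated on a toy model. The sketch below is not the authors' wave-tank setup: an Ornstein-Uhlenbeck process stands in for the wave dynamics, and gamma, T, z are illustrative parameters. It minimizes the discretized Freidlin-Wentzell action over paths pinned to an extreme final value and compares the result with the known analytic instanton:

```python
import numpy as np

gamma, T, z = 1.0, 2.0, 2.0   # illustrative relaxation rate, horizon, extreme level
N = 200
dt = T / N

# Forward-difference operator recovering the noise realization from a path:
# u_i = (x_{i+1} - x_i)/dt + gamma * x_i, so the action is (dt/2) * ||u||^2
M = np.zeros((N, N + 1))
for i in range(N):
    M[i, i] = -1.0 / dt + gamma
    M[i, i + 1] = 1.0 / dt

# Fix the endpoints x_0 = 0, x_N = z and minimize the quadratic action
# over the interior points; this is a linear least-squares problem.
A = M[:, 1:N]            # columns acting on the free interior points
b = -M[:, N] * z         # endpoint x_N = z moved to the right-hand side
x_int = np.linalg.lstsq(A, b, rcond=None)[0]

x = np.concatenate(([0.0], x_int, [z]))

# Analytic instanton for the OU process: x(t) = z sinh(gamma t) / sinh(gamma T)
t = np.linspace(0.0, T, N + 1)
exact = z * np.sinh(gamma * t) / np.sinh(gamma * T)
print(np.max(np.abs(x - exact)))   # small discretization error
```

The same variational structure carries over to the nonlinear wave problem, except that the action must then be minimized numerically with optimal-control techniques rather than solved as a linear system.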

Giovanni Dematteis
Amos Eaton 214 4:00 pm

Apr 22, 2019
Statistical Reduced Models and Rigorous Analysis for Uncertainty Quantification of Turbulent Geophysical Flows

Abstract: The capability of using imperfect statistical reduced-order models to capture crucial statistics in turbulent flows is investigated. Much simpler and more tractable block-diagonal models are proposed to approximate the complex and high-dimensional turbulent flow equations. A systematic framework of correcting model errors with empirical information theory is introduced, and optimal model parameters under this unbiased information measure can be achieved in a training phase before the prediction...

Amos Eaton 214 4:00 pm

Apr 15, 2019
An emerging paradigm in biology: The power of weak binding

Abstract: It is cliché to mimic biological design rules in synthetic materials, yet this is the precise challenge for regenerative medicine, therapies for disease pathologies, and vaccines. To design and engineer solutions to biological dysfunction, it is essential to understand Nature’s design rules for successful function. Today, we have data, amazing data, from advances in super-resolution (spatial and temporal) microscopy, targeted fluorescent signaling, chemical synthesis, and various passive and active probes of living systems. I will introduce two biological systems that rely on transient, short-lived binding interactions to perform diverse functionalities: the genome in live cells, with the requirement of genes to self-organize; and the mucus barriers covering every organ, with the requirement to regulate diffusive transport of foreign species within, and to flow in order to clear all trapped insults. Time permitting, I will mention other examples. Each system is explored through feedback between experimental data, data analytics, mechanistic modeling and computation, and visualization of experimental and simulated data. Many collaborators will be acknowledged in the lecture.

2019 Richard C. DiPrima Lecture
Amos Eaton 214 4:00 pm

Apr 9, 2019
Neural networks as interacting particle systems: understanding global convergence of parameter optimization dynamics

Abstract: The performance of neural networks on high-dimensional data distributions suggests that it may be possible to parameterize a representation of a target high-dimensional function with controllably small errors, potentially outperforming standard interpolation methods. We demonstrate, both theoretically and numerically, that this is indeed the case. We map the parameters of a neural network to a system of particles relaxing with an interaction potential determined by the loss function. This mapping gives rise to a deterministic partial differential equation that governs the parameter evolution under gradient descent dynamics. We also show that in the limit where the number of parameters n is large, the landscape of the mean-squared error becomes convex and the representation error in the function scales like n^{-1}. In this limit, we prove a dynamical variant of the universal approximation theorem showing that the optimal representation can be attained by stochastic gradient descent, the algorithm ubiquitously used for parameter optimization in machine learning. This conceptual framework can be leveraged to develop algorithms that accelerate optimization using non-local transport. I will conclude by showing that using neuron birth/death processes in parameter optimization guarantees global convergence and provides a substantial acceleration in practice.
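The n^{-1} scaling of the representation error can already be seen in a stateless Monte Carlo caricature of the mean-field limit. In the hedged sketch below (an illustration, not the speaker's construction; the tanh feature and the parameter distribution are arbitrary choices), a target function is written as an expectation over random parameters and approximated by an average over n "particles"; the mean squared error then decays like 1/n, so multiplying n by 16 shrinks it roughly 16-fold:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target function written as an expectation over random parameters:
# f*(x) = E_w[tanh(w * x)],  w ~ N(1, 0.5^2)  (illustrative choice)
xs = np.linspace(-1.0, 1.0, 50)

def target(x, n_ref=200_000):
    # High-accuracy Monte Carlo reference for f*
    w = rng.normal(1.0, 0.5, n_ref)
    return np.tanh(np.outer(x, w)).mean(axis=1)

f_star = target(xs)

def mse(n, trials=200):
    # Average squared error of an n-particle representation over many draws
    errs = []
    for _ in range(trials):
        w = rng.normal(1.0, 0.5, n)
        f_n = np.tanh(np.outer(xs, w)).mean(axis=1)
        errs.append(np.mean((f_n - f_star) ** 2))
    return np.mean(errs)

e_small, e_big = mse(25), mse(400)
print(e_small / e_big)   # should be close to 400 / 25 = 16
```

This static picture only captures the variance of a random representation; the content of the talk is the harder dynamical statement that gradient descent actually finds such a representation.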

Amos Eaton 214 4:00 pm

Apr 1, 2019
Modeling Particle Tracking Experiments that Reveal Intracellular Transport Mechanism

Many of the mechanisms that underlie intracellular transport by molecular motors are well studied by both experiments and mathematical modeling. This is especially true of cargos that are being transported by individual motors and that are perturbed by certain kinds of external forces. However, our understanding of single-motor dynamics does not yet yield robust predictions for what has been observed at the whole-cell scale when multiple opposite-directed motors interact with each other. In this talk, we will explore recent mathematical modeling and Bayesian inferential efforts directed at new experiments aimed at revealing multi-motor interactions in vitro.

Amos Eaton 214 4:00 pm

Mar 25, 2019
Deep Learning with Graph Structured Data: Methods, Theory, and Applications

Abstract: Graphs are universal representations of pairwise relationships. A trending topic in deep learning is to extend the remarkable success of well-established neural network architectures (e.g., CNN and RNN) for Euclidean structured data to irregular domains, including, notably, graphs. A proliferation of graph neural networks (e.g., GCN) has emerged recently, but the scalability challenge for training and inference persists. The essence of the problem is the prohibitive computational cost of computing a mini-batch, owing to the recursive expansion of neighborhoods. We propose a scalable approach, coined FastGCN, based on neighborhood sampling to reduce the mini-batch computation. FastGCN achieves orders-of-magnitude improvement in training time compared with a standard implementation of GCN, while predictions remain comparably accurate. A curious question for this approach is why stochastic gradient descent (SGD) training ever converges. In the second part of this talk, we show that the gradient estimator so computed is not unbiased but consistent. We thus extend the standard SGD results for unbiased gradients to consistent gradients and show that their convergence behaviors are similar. These results are important and may spawn new interest in the machine learning community, since in many learning scenarios unbiased estimators may not be efficient to compute, and hence other nonstandard but fast gradient estimators serve as sound alternatives.
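The sampling idea behind FastGCN can be caricatured in a few lines. The sketch below is an illustration of the principle on a random toy graph, not the paper's implementation (it uses uniform rather than importance sampling, and all sizes are made up): one layer's aggregation A H is estimated from t sampled nodes, and averaging many draws confirms the estimator is unbiased:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy graph: row-normalized adjacency of a random graph (illustrative sizes)
N, d, t = 100, 8, 20          # nodes, feature dimension, sampled nodes per layer
A = (rng.random((N, N)) < 0.1).astype(float)
A /= A.sum(axis=1, keepdims=True) + 1e-12
H = rng.normal(size=(N, d))   # input node features

full = A @ H                  # exact one-layer aggregation, O(N^2 d)

def sampled_agg():
    # Sample t nodes uniformly with replacement and rescale: an unbiased
    # Monte Carlo estimate of A @ H using only t columns, O(N t d)
    idx = rng.choice(N, size=t, replace=True)
    return (N / t) * A[:, idx] @ H[idx]

# Averaging many independent estimates should recover the exact aggregation
avg = np.mean([sampled_agg() for _ in range(2000)], axis=0)
rel_err = np.linalg.norm(avg - full) / np.linalg.norm(full)
print(rel_err)
```

In training, each mini-batch uses a single such draw per layer, which is where the non-zero per-step bias-variance trade-off, and hence the consistency (rather than unbiasedness) analysis of the resulting gradient, comes from.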

Amos Eaton 214 4:00 pm

Mar 18, 2019
Solving Inverse Problems on Networks: Graph Cuts, Optimization Landscape, Synchronization (SIAM Presentation)

Abstract: Information retrieval from graphs plays an increasingly important role in data science and machine learning. This talk focuses on two such examples. The first concerns the graph cuts problem: how to find the optimal k-way graph cut given an adjacency matrix. We present a convex relaxation of ratio cut and normalized cut, which gives rise to a rigorous theoretical analysis of graph cuts. We derive deterministic bounds for finding the optimal graph cuts via a spectral proximity condition that naturally depends on the intra-cluster and inter-cluster connectivity. Moreover, our theory provides theoretical guarantees for spectral clustering and community detection under the stochastic block model...

Amos Eaton 214 4:00 pm

Feb 19, 2019
Distributed Optimization and Resource Allocation: Algorithms and the Mirror Relation

Abstract: In this talk, we consider the problems of consensus optimization and resource allocation, and how to solve them in a decentralized manner. These two problems are known to be closely related to empirical risk minimization in machine learning and management problems in operations research, respectively. By saying “decentralized”, we mean that the tasks are to be completed over a set of networked agents in which each agent is able to communicate with adjacent agents. For both problems, every agent in the network wants to collaboratively minimize a function that involves global information, while only a piece of that information is available to each of them.

Amos Eaton 214 4:00 pm

2018

Nov 26, 2018
Spatially compatible meshfree discretization through GMLS and graph theory

Spatially compatible discretization is a term that broadly encapsulates numerical discretizations of PDEs that possess some mimetic property of the continuum operators, such as conservation, maximum principles, H(div)/H(curl)-conformity, or discrete preservation of an exact sequence. The construction of such methods is greatly facilitated by employing the exterior calculus framework and the duality between the boundary operators and the exterior derivative in the generalized Stokes theorem. In a finite element context, such tools form the so-called finite element exterior calculus (FEEC), which has unified mixed finite element theory. Meshfree methods, on the other hand, seek to solve a PDE strictly in terms of point evaluations (0-forms) to facilitate automated geometry discretization and large deformation Lagrangian mechanics...

Amos Eaton 214 4:00 pm

Nov 19, 2018
Fractalization and Quantization in Dispersive Systems

The evolution, through spatially periodic linear dispersion, of rough initial data produces fractal, non-differentiable profiles at irrational times and, for asymptotically polynomial dispersion relations, quantized structures at rational times.  Such phenomena have been observed in dispersive wave models, optics, and quantum mechanics, and lead to intriguing connections with exponential sums arising in number theory. Ramifications and recent progress on the analysis, numerics, and extensions to nonlinear wave models, both integrable and non-integrable, will be presented.

Amos Eaton 214 4:00 pm

Nov 12, 2018
A Block Coordinate Ascent Algorithm for Mean-Variance Optimization

Abstract: Risk management in dynamic decision problems is a primary concern in many fields, including financial investment, autonomous driving, and healthcare. The mean-variance function is one of the most widely used objective functions in risk management due to its simplicity and interpretability. Existing algorithms for mean-variance optimization are based on multi-time-scale stochastic approximation, whose learning rate schedules are often hard to tune, and have only asymptotic convergence proofs. In this talk, we develop a model-free policy search framework for mean-variance optimization with finite-sample error bound analysis (to local optima). Our starting point is a reformulation of the original mean-variance function with its Legendre-Fenchel dual, from which we propose a stochastic block coordinate ascent policy search algorithm. Both the asymptotic convergence guarantee of the last iterate and the convergence rate of a randomly picked solution are provided.
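The Legendre-Fenchel step can be made explicit. Using the standard dual representation of the square, x^2 = max_y (2xy - y^2), applied to (E[R])^2, the mean-variance objective becomes a single expectation (a generic identity, sketched here under the assumed notation that R is the return and lambda > 0 the risk-aversion weight):

```latex
\mathbb{E}[R] - \lambda \operatorname{Var}(R)
  = \mathbb{E}[R] - \lambda \mathbb{E}[R^2] + \lambda\,(\mathbb{E}[R])^2
  = \max_{y}\; \mathbb{E}\!\left[\, R + 2\lambda y R - \lambda R^2 - \lambda y^2 \,\right],
```

since (E[R])^2 = max_y (2y E[R] - y^2), with the maximum attained at y = E[R]. The inner objective is now a plain expectation, so the policy parameters and the dual variable y can be updated alternately from samples, which is precisely the structure a block coordinate ascent scheme exploits.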

Amos Eaton 214 4:00 pm

Oct 29, 2018
Critical points and their stability in continuous resonance equations

Recently, Faou, Germain, and Hani introduced the continuous resonance (CR) equation by systematically applying a variant of the weak turbulence approach to the nonlinear Schroedinger equation (NLS). They obtained the CR equation as a large-box limit of the two-dimensional NLS in a weakly nonlinear regime. We consider the one-dimensional version of the CR equation and investigate its critical points and their stability. For the three-dimensional CR equation, we argue that the Gaussian is a ground state, using a combination of numerical and analytical methods. This is joint work with Gene Wayne (Boston University).

Amos Eaton 214 4:00 pm

Oct 15, 2018
Applications of Renewal Theory to Intracellular Transport

Abstract:
Intracellular transport, especially in axons, consists primarily of different molecular motors moving along microtubules with cargos in tow. Biochemical processes at the nanoscale control the dynamics of the motors. Changes in the behavior of the motors (speed, diffusivity, processivity, etc.) can then alter the distribution and dynamics of the population of transported cargos at the scale of several microns. Therefore, an important element in understanding cellular regulation of transport is the interaction between these motor-level and cell-level scales.
In this talk, I will discuss how renewal and renewal-reward processes are used to build multi-scale models of cellular transport, connecting fine-scale biophysical models (typically Markovian) to coarse-scale models that are more relevant both for experimental observations and for understanding transport at the cell level.
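A minimal renewal-reward sketch (a generic two-state toy motor with made-up rates, not a model from the talk): processive runs alternate with pauses, each run-pause pair forms one renewal cycle, and the renewal-reward theorem gives the long-run velocity as the expected displacement per cycle divided by the expected cycle duration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative parameters: run speed, detachment rate, reattachment rate
v, k_off, k_on = 1.0, 2.0, 1.0
n_cycles = 200_000

runs = rng.exponential(1.0 / k_off, n_cycles)    # run durations (motor moving)
pauses = rng.exponential(1.0 / k_on, n_cycles)   # pause durations (motor stalled)

# Renewal-reward theorem: long-run velocity = E[reward per cycle] / E[cycle length]
v_eff = np.sum(v * runs) / np.sum(runs + pauses)
v_theory = v * (1.0 / k_off) / (1.0 / k_off + 1.0 / k_on)
print(v_eff, v_theory)   # the two values agree closely
```

The multi-scale models in the talk follow the same logic, with the fine-scale Markovian motor dynamics supplying the per-cycle statistics that the coarse-scale transport description consumes.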

Amos Eaton 214 4:00 pm

Oct 9, 2018
Symmetry-preserving and positivity-preserving Lagrangian schemes for compressible multi-material fluid flows

Abstract: The Lagrangian method is widely used in many fields for multi-material flow simulations due to its distinguished advantage in capturing material interfaces and free boundaries automatically. In applications such as astrophysics and inertial confinement fusion, there are three-dimensional cylindrically symmetric multi-material problems which are usually simulated by the Lagrangian method in two-dimensional cylindrical coordinates. For this type of simulation, the critical issues for the schemes include preserving positivity of physically positive variables, such as density and internal energy, and preserving spherical symmetry in the cylindrical coordinate system when the original physical problem has this symmetry.

Amos Eaton 214 4:00 pm

Oct 1, 2018
Learning Graph Inverse Problems with Geometric Neural Networks

Abstract: Inverse problems on graphs encompass many areas of physics, algorithms, and statistics, and are a confluence of powerful methods, ranging from computational harmonic analysis and high-dimensional statistics to statistical physics. As with inverse problems in signal processing, learning has emerged as an intriguing alternative to regularization and other computationally tractable relaxations, opening up new questions in which high-dimensional optimization, neural networks, and data play a prominent role. In this talk, I will argue that several tasks that are ‘geometrically stable’ can be well approximated with Graph Neural Networks, a natural extension of Convolutional Neural Networks to graphs. I will present recent work on supervised community detection, quadratic assignment, neutrino detection, and beyond, showing the flexibility of GNNs to extend classic algorithms such as Belief Propagation.

Amos Eaton 214 4:00 pm

Sep 17, 2018
Mathematical modeling of positioning and size scaling of nuclei in multi-nucleated muscle cells

The nucleus is the organizing center of a cell. We use multi-scale modeling to understand how dozens of nuclei in multi-nucleated muscle cells position themselves and adapt their size. Positioning mechanisms involve cytoskeletal fibers, called microtubules, that interact with molecular motors to create forces. We perform large-scale computational force screens with hundreds of coarse models to predict nuclear positions...

Amos Eaton 214 4:00 pm