(Tentative Schedule)

Time & Location: All talks are on Thursdays in Gibson Hall 126 at 3:30 pm unless otherwise noted. Refreshments in Gibson 426 after the talk.

Comments indicating vacations, special lectures, or changes in location or time are noted in the schedule.

Organizer: Gustavo Didier

**Abstract**:

The height distributions of critical points of random fields arise from p-value computations when performing hypothesis tests at critical points such as local maxima. In this talk, we will present formulae for the height distributions of critical points of smooth isotropic Gaussian random fields. The results hold in general, in the sense that there are no restrictions on the covariance function of the field except for smoothness and isotropy. They are based on a characterization of the distribution of the Hessian of the Gaussian field by means of the family of Gaussian orthogonally invariant (GOI) matrices, of which the Gaussian orthogonal ensemble (GOE) is a special case. We then apply the results to a topological multiple testing scheme for detecting peaks in images under stationary ergodic Gaussian noise in Euclidean space, where tests are performed at local maxima of the smoothed observed signals. The resulting STEM algorithms, combined with the Benjamini-Hochberg procedure for thresholding p-values, provide asymptotic strong control of the false discovery rate (FDR) and power consistency, with specific rates, as the search space and signal strength grow. Simulations show that FDR levels are maintained in non-asymptotic conditions. The methods are illustrated in the analysis of functional magnetic resonance images of the brain. The multiple testing of local maxima is also extended to Gaussian random fields on the sphere, providing a powerful tool for detecting point sources in CMB data in astronomy. Another important application is detecting change points by performing multiple testing at critical points of the smoothed observed signal. We will also discuss some open problems and future research.
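The Benjamini-Hochberg step mentioned in the abstract can be sketched as follows. This is a generic illustration of the BH procedure applied to a list of p-values, not the STEM-specific implementation from the talk; the function and variable names are ours:

```python
def benjamini_hochberg(pvalues, q=0.05):
    """Return the (sorted) indices of hypotheses rejected by the
    Benjamini-Hochberg procedure at target FDR level q."""
    m = len(pvalues)
    # Sort p-values, remembering their original positions.
    order = sorted(range(m), key=lambda i: pvalues[i])
    # Find the largest rank k with p_(k) <= k * q / m (1-indexed).
    k = 0
    for rank, idx in enumerate(order, start=1):
        if pvalues[idx] <= rank * q / m:
            k = rank
    # Reject the hypotheses with the k smallest p-values.
    return sorted(order[:k])

# Example: the three small p-values are rejected at q = 0.05.
rejected = benjamini_hochberg([0.01, 0.6, 0.02, 0.5, 0.03], q=0.05)
print(rejected)  # -> [0, 2, 4]
```

In the peak-detection setting of the talk, each p-value would come from the height distribution of a local maximum of the smoothed field.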

**Abstract**:

One of the outstanding challenges in atomistic simulations of materials is how to reach physically meaningful time scales. While the fundamental time scale of atomistic models is the femtosecond, physically meaningful phenomena may take microseconds or longer to occur. This precludes a direct numerical simulation with, for instance, a Langevin model of the material from reaching physical time scales. The time scale separation challenge has motivated the development of a variety of multiscale methods, including accelerated molecular dynamics, kinetic Monte Carlo, phase field models, and diffusive molecular dynamics. In this talk, I will survey some of these approaches and discuss common mathematical assumptions that underlie them, while also highlighting where approximations have been made. Rigorous results will be presented, where available, along with outstanding mathematical challenges.

**Abstract**:

We propose a new measure of stationarity for a functional time series, based on an explicit representation of the L^2-distance between the spectral density operator of a non-stationary process and its best (L^2-)approximation by a spectral density operator corresponding to a stationary process. This distance can easily be estimated by sums of Hilbert-Schmidt inner products of periodogram operators (evaluated at different frequencies), and asymptotic normality of an appropriately standardised version of the estimator can be established under both the null hypothesis and the alternative. As a result, we obtain confidence intervals for the discrepancy of the underlying process from a functional stationary process, and a simple asymptotic frequency domain level α test (using the quantiles of the normal distribution) for the hypothesis of stationarity of a functional time series. Moreover, the new methodology also allows testing precise hypotheses of the form "the functional time series is approximately stationary", meaning that the new measure of stationarity is smaller than a given threshold. Thus, in contrast to methods proposed in the literature, our approach also allows testing for "relevant" deviations from stationarity.

We demonstrate in a small simulation study that the new method has very good finite sample properties and compare it with the currently available alternative procedures. Moreover, we apply our test to annual temperature curves.

**Abstract**:

Connecting dynamic models with data to yield predictive results often requires a variety of parameter estimation, identifiability, and uncertainty quantification techniques. These approaches can help to determine what is possible to estimate from a given model and data set, and help guide new data collection. Here, we will discuss differential algebraic and simulation-based approaches to identifiability analysis, and examine how parameter estimation and disease forecasting are affected when examining disease transmission via multiple types or pathways of transmission. Using examples taken from cholera and polio outbreaks in several settings, we illustrate some of the potential difficulties in estimating the relative contributions of different transmission pathways, and show how alternative data collection may help resolve this unidentifiability.

**Abstract**:

This talk navigates the landscape of stochastic filtering, its computational implementations, and their applications in science, engineering, and national defense. We start by exploring properties of the optimal filtering distribution. Under general conditions, the filtering distribution does not admit a closed-form solution. Employing several methods, e.g. particle filters, we approximate it and explore properties of the underlying process and its parameters. The parameter estimation leads us to a research path involving a novel algorithm that blends particle filters with a Markov chain Monte Carlo scheme, a sequential empirical Bayes method, and related sufficient estimators. Finally, the talk adopts this research path and sheds light on the estimation of the spatiotemporal evolution of radioactive material released in the disastrous accident at the Fukushima power plant in 2011.
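As a generic illustration of the particle filtering idea (not the speaker's specific algorithm), the sketch below runs a bootstrap particle filter on a toy one-dimensional autoregressive state-space model; the model, its parameters, and all names here are our own assumptions:

```python
import math
import random

random.seed(1)

# Toy state-space model: x_t = 0.9 * x_{t-1} + N(0, 0.5^2),
# observed as y_t = x_t + N(0, 0.5^2).
PHI, SIG_X, SIG_Y, T, N = 0.9, 0.5, 0.5, 50, 500

# Simulate a ground-truth trajectory and noisy observations.
xs, ys = [], []
x = 0.0
for _ in range(T):
    x = PHI * x + random.gauss(0, SIG_X)
    xs.append(x)
    ys.append(x + random.gauss(0, SIG_Y))

# Bootstrap particle filter: propagate each particle through the
# dynamics, weight by the likelihood of the new observation, then
# resample proportionally to the weights.
particles = [0.0] * N
estimates = []
for y in ys:
    particles = [PHI * p + random.gauss(0, SIG_X) for p in particles]
    weights = [math.exp(-0.5 * ((y - p) / SIG_Y) ** 2) for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    estimates.append(sum(w * p for w, p in zip(weights, particles)))
    particles = random.choices(particles, weights=weights, k=N)

rmse = math.sqrt(sum((e - x) ** 2 for e, x in zip(estimates, xs)) / T)
print(round(rmse, 3))  # well below the observation noise level
```

The filtered estimate is noticeably more accurate than the raw observations, which is the basic payoff of approximating the filtering distribution by a weighted particle cloud.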

**Location:** Stanley Thomas 316

**Time:** 3:30 PM

**Abstract**:

Power and reproducibility are key to enabling refined scientific discoveries in contemporary big data applications with general high-dimensional nonlinear models. In this paper, we provide theoretical foundations for the power and robustness of the model-X knockoffs procedure introduced recently in Candes, Fan, Janson and Lv (2017), in the high-dimensional setting where the covariate distribution is characterized by a Gaussian graphical model. We establish that under mild regularity conditions, the power of the oracle knockoffs procedure with known covariate distribution in high-dimensional linear models is asymptotically one as the sample size goes to infinity. When moving away from the ideal case, we suggest a modified model-free knockoffs method, called graphical nonlinear knockoffs (RANK), to accommodate the unknown covariate distribution. We provide theoretical justifications for the robustness of our modified procedure by showing that the false discovery rate (FDR) is asymptotically controlled at the target level and the power is asymptotically one with the estimated covariate distribution. To the best of our knowledge, this is the first formal theoretical result on the power of the knockoffs procedure. Simulation results demonstrate that, compared to existing approaches, our method performs competitively in both FDR control and power. A real data set is analyzed to further assess the performance of the suggested knockoffs procedure. This is joint work with Emre Demirkaya, Yingying Fan and Gaorong Li.

**Abstract**:

Flow polytopes of graphs form a rich family of polytopes that includes the Pitman-Stanley polytope and a face of the polytope of doubly stochastic matrices called the Chan-Robbins-Yuen polytope. The lattice points of these polytopes are counted by Kostant's vector partition function from Lie theory. In the early 2000s, Postnikov-Stanley and Baldoni-Vergne gave remarkable formulas for their volumes and numbers of lattice points, using the Elliott-MacMahon algorithm and residue computations respectively.

In this talk we will describe these polytopes, how to subdivide them to obtain these formulas, and a model for the formulas using certain well-known combinatorial objects called parking functions. We will illustrate the subdivision and the model with known and new examples of flow polytopes with surprising volumes.

This is based on joint work with Karola Meszaros and joint work with Carolina Benedetti, Rafael Gonzalez D'Leon, Chris Hanusa, Pamela Harris, Apoorva Khare and Martha Yip.

**Abstract**:

A long-standing, fundamental question in biology is "what are the minimal conditions to ensure the long-term persistence of a population, or to ensure the long-term coexistence of interacting species?" The answers to this question are essential for identifying mechanisms that maintain biodiversity and guiding conservation efforts. Mathematical models play an important role in identifying potential mechanisms and, when coupled with empirical work, can determine whether or not a given mechanism is operating in a specific population or community. For over a century, nonlinear difference and differential equations have been used to identify mechanisms for population persistence and species coexistence. These models, however, fail to account for the intrinsic and extrinsic random fluctuations experienced by all populations. In this talk, I discuss recent mathematical results about persistence and coexistence for models accounting for demographic and environmental stochasticity.

Demographic stochasticity stems from populations consisting of a finite number of interacting individuals. These dynamics can be represented by Markov chains on a countable state space. For closed populations in a bounded world, extinction in these models occurs in finite time, but may be preceded by long-term transients. Quasi-stationary distributions (QSDs) of these Markov chains characterize this meta-stable behavior. These QSDs correspond to an eigenvector of the transition operator restricted to non-extinction states, and the associated eigenvalue determines the mean time to extinction when the Markov chain is in the quasi-stationary state. I will discuss under what conditions (i) this mean time to extinction increases exponentially with "habitat size" and (ii) the QSDs concentrate on attractors of the mean field model of the Markov chain. These results will be illustrated with models of competing Californian annual plants and chaotic beetles.
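The QSD characterization above can be illustrated numerically: for a chain with absorption at extinction, the QSD is the leading left eigenvector of the transition matrix restricted to the non-extinction states, and the associated eigenvalue gives the mean extinction time from quasi-stationarity. The birth-death chain below is a toy example of our own, solved by power iteration:

```python
# Substochastic transition matrix Q on the non-extinction states
# {1, 2, 3} of a toy birth-death chain; the missing mass in row 1
# (0.2) is the probability of absorption into extinction at state 0.
Q = [[0.4, 0.4, 0.0],
     [0.2, 0.4, 0.4],
     [0.0, 0.2, 0.8]]

# Power iteration for the leading left eigenvector v (the QSD):
# v Q = lam * v, with v a probability vector on {1, 2, 3}.
v = [1 / 3] * 3
for _ in range(5000):
    w = [sum(v[i] * Q[i][j] for i in range(3)) for j in range(3)]
    lam = sum(w)               # eigenvalue = one-step survival probability
    v = [x / lam for x in w]   # renormalize to a probability distribution

# Started from the QSD, extinction time is geometric with mean 1/(1-lam).
mean_extinction_time = 1 / (1 - lam)
print([round(x, 3) for x in v], round(mean_extinction_time, 1))
```

In the talk's setting, the interesting regime is how this mean extinction time scales (exponentially, under suitable conditions) as the habitat size parameter of the model grows.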

On the other hand, environmental stochasticity stems from fluctuations in environmental conditions which influence survival, growth, and reproduction. These effects on population and community dynamics can be modeled by stochastic difference or differential equations. For these models, "stochastic persistence" corresponds to the weak* limit points of the empirical measures of the process placing arbitrarily little weight on arbitrarily low population densities. I will discuss sufficient and necessary conditions for stochastic persistence. These conditions involve Lyapunov exponents corresponding to the "realized" per-capita growth rates of species with respect to stationary distributions supporting subsets of species. These results will be illustrated with models of Bay checkerspot butterflies and eco-evolutionary rock-paper-scissor dynamics.

**Abstract**:

Vector-borne diseases such as Lyme disease, Dengue fever and Zika virus have imposed significant challenges for public health decision support systems. Modern technologies and increasing global interdisciplinary collaborations have promised rich sources of data about vector and host ecology, pathogen epidemiology and environmental conditions, so it is imperative to have fundamental (mathematical and computational) frameworks which integrate data from all of these sources in order to provide summative predictions of spatiotemporal patterns of disease spread and evaluation of intervention strategies. Here we use Lyme disease as a case study to show how clinical, laboratory, field observation and surveillance data, along with remote sensor and GIS information, can be integrated through a structured (hyperbolic or delay differential equation) epidemiological model to produce infection risk maps using the classical Floquet theory. We will also show how to incorporate climate-change-induced (vector) biological invasion into a typical reaction-diffusion epidemic model, and present some recent results about the spatiotemporal patterns of these reaction-diffusion equations in a wave-like environment.

**Abstract**:

The sequence 1, 1, 2, 5, 14, 42, 132, ... of Catalan numbers is perhaps the most ubiquitous integer sequence in mathematics. We will give a survey of these numbers for a general mathematical audience. Topics will include the history of Catalan numbers, some combinatorial interpretations (taken from the 214 interpretations in my monograph on Catalan numbers), some algebraic interpretations, one of the many known generalizations of Catalan numbers, and some connections with number theory and analysis.
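For concreteness, the Catalan numbers satisfy the closed form C_n = binom(2n, n) / (n + 1) and the convolution recurrence C_{n+1} = sum_{k=0}^{n} C_k C_{n-k}. A minimal sketch of both (standard facts, not material from the talk):

```python
from math import comb

def catalan(n):
    """n-th Catalan number via the closed form C_n = C(2n, n) / (n + 1)."""
    return comb(2 * n, n) // (n + 1)

def catalan_rec(n):
    """Same sequence via the convolution recurrence
    C_{m+1} = sum_{k=0}^{m} C_k * C_{m-k}, with C_0 = 1."""
    c = [1]
    for m in range(n):
        c.append(sum(c[k] * c[m - k] for k in range(m + 1)))
    return c[n]

print([catalan(n) for n in range(7)])  # -> [1, 1, 2, 5, 14, 42, 132]
```

The recurrence is what makes the counting interpretations work: splitting a structure at its first "return" decomposes it into two smaller structures of the same kind.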

Nestor Guillen, University of Massachusetts at Amherst (Host: Nathan Glatt-Holtz)

**Abstract**:

In mathematics as well as in physics, we are used to dealing with partial differential equations (PDEs): the infinitesimal rates of change of one or more fields are constrained by a pointwise relationship representing a physical law or geometric constraint. However, many problems in physics and mathematics feature strong long-range effects which impose constraints on the variation of fields beyond the infinitesimal scale; such constraints are encoded not via PDEs but via integro-differential equations. Such equations go as far back as Leibniz, who first studied the notion of a fractional-order derivative. In this talk I will survey the field of integro-differential equations, discussing important examples from statistical mechanics, fluid mechanics, stochastic processes, conformal geometry, and more. I will highlight recent results in the area, and discuss a recent result obtained with Russell Schwab regarding a min-max representation formula for these operators and how it can be applied to free boundary problems.

**Abstract**:

After introducing the flag and Grassmannian varieties, I shall introduce Schubert varieties, and I will then show several examples of important algebraic varieties related to Schubert varieties.

**Abstract**:

I will describe joint work with Stan Alama, Lia Bronsard, Andres Contreras and Jiri Dadok giving criteria for existence and for non-existence of certain isoperimetric planar curves minimizing length with respect to a metric having conformal factor that is degenerate at two points, such that the curve encloses a specified amount of Euclidean area. These curves, appropriately parametrized, emerge as traveling waves for a bi-stable Hamiltonian system that can be viewed as a conservative model for phase transitions.

Mathematics Department, 424 Gibson Hall, New Orleans, LA 70118 504-865-5727 math@math.tulane.edu