The first machine learning seminar this semester will be given by Dr. Sergey Dolgov (University of Bath), who is currently visiting SMRI.

Where and when: Carslaw 375, 1pm, August 18 (Monday)

Title: Low-rank approximations for large-scale nonlinear feedback control

Abstract: Computing the optimal feedback law for general (nonlinear/unstable/stochastic) dynamical systems requires solving the Hamilton-Jacobi-Bellman partial differential equation (PDE), which suffers from the curse of dimensionality. We develop a unified framework for computing a fast surrogate model of the feedback control function based on low-rank decompositions of matrices and tensors. Firstly, we propose a Statistical Proper Orthogonal Decomposition (SPOD) for model order reduction of very high-dimensional systems, such as the discretised Navier-Stokes equations or other PDEs, by compressing snapshots corresponding to random samples of all parameters in the system, the initial condition and time. Secondly, we compute a low-rank Functional Tensor Train (TT) approximation of the feedback control function for the reduced model. The pre-trained TT representation of the control as a function of the reduced state can then be used for real-time online generation of the control signal. Using the proposed SPOD and TT approximations, we demonstrate a controller, computable in milliseconds, that achieves lower vorticity of the Navier-Stokes flow with random inflow than reduced bases and controllers produced using only the mean inflow.

Speaker's short bio: Sergey Dolgov is currently a Reader (Associate Professor) at the University of Bath. He earned his PhD in October 2014 from the University of Leipzig and the Max Planck Institute for Mathematics in the Sciences, supervised by Boris Khoromskij. He was then a postdoc at the Max Planck Institute for Dynamics of Complex Technical Systems in Magdeburg with Peter Benner and Martin Stoll. He has been at Bath since January 2016, first as a postdoctoral fellow of the Engineering and Physical Sciences Research Council, then as Lecturer from 2018 and Reader from 2022. His main research interests are tensor approximations and algorithms, applied in computational statistics, uncertainty quantification and feedback control.
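
Background note: as a rough picture of the PDE mentioned in the abstract, a standard textbook form of the HJB equation for an infinite-horizon, control-affine problem (not necessarily the exact formulation used in the talk) is as follows. For dynamics $\dot x = f(x) + g(x)u$ and cost $\int_0^\infty \big(\ell(x) + \|u\|^2\big)\,dt$, the value function $V$ satisfies

\[
\min_{u} \Big\{ \nabla V(x)^\top \big(f(x) + g(x)u\big) + \ell(x) + \|u\|^2 \Big\} = 0,
\qquad
u^*(x) = -\tfrac{1}{2}\, g(x)^\top \nabla V(x),
\]

a PDE posed on the full state space, which is why direct grid-based discretisation suffers from the curse of dimensionality.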
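
For readers unfamiliar with snapshot-based model order reduction, here is a minimal sketch of plain POD of a snapshot matrix (illustrative only: the array names, sizes and random data are placeholders, and this is ordinary POD rather than the Statistical POD presented in the talk).

import numpy as np

# Hypothetical setup: each column of `snapshots` is a discretised PDE state,
# sampled at random parameters, initial conditions and times.
rng = np.random.default_rng(0)
n_state, n_snapshots = 2000, 200
snapshots = rng.standard_normal((n_state, n_snapshots))   # placeholder data

# Truncated SVD of the snapshot matrix gives the reduced basis.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99)) + 1                # keep 99% of snapshot energy
basis = U[:, :r]                                          # POD basis, shape (n_state, r)

# Project a full state to reduced coordinates and lift it back.
x_full = rng.standard_normal(n_state)
x_reduced = basis.T @ x_full
x_approx = basis @ x_reduced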
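
Similarly, a small self-contained sketch of the tensor-train format behind the surrogate control function: the classical TT-SVD compresses a full tensor into a chain of cores, and evaluating a single entry from the cores is the cheap online operation. This is a generic illustration, not the functional TT algorithm of the talk.

import numpy as np

def tt_svd(tensor, rel_tol=1e-8):
    """Sequential SVD-based TT decomposition of a full tensor (TT-SVD)."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    unfolding = tensor.reshape(shape[0], -1)
    rank = 1
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(unfolding, full_matrices=False)
        keep = max(1, int(np.sum(s > rel_tol * s[0])))     # truncate the TT rank
        cores.append(U[:, :keep].reshape(rank, shape[k], keep))
        unfolding = (np.diag(s[:keep]) @ Vt[:keep]).reshape(keep * shape[k + 1], -1)
        rank = keep
    cores.append(unfolding.reshape(rank, shape[-1], 1))
    return cores

def tt_eval(cores, idx):
    """Evaluate one entry of the TT tensor at a multi-index (the fast online step)."""
    v = np.ones((1, 1))
    for core, i in zip(cores, idx):
        v = v @ core[:, i, :]                              # contract core by core
    return float(v[0, 0])

# Quick check on a small random tensor.
T = np.random.default_rng(1).standard_normal((4, 5, 6, 7))
cores = tt_svd(T)
assert abs(tt_eval(cores, (1, 2, 3, 4)) - T[1, 2, 3, 4]) < 1e-8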