Applied and Computational Mathematics Division, ITL, NIST
Tuesday, March 3, 2020
3:00–4:00 PM, Building 101, Lecture Room B (Gaithersburg)
1:00–2:00 PM, Building 1, Room 3304 (Boulder)
This talk will be broadcast online using BlueJeans. Contact acmdseminar [at] nist.gov for details.
Abstract: Many problems in engineering and scientific computing involve manipulating input parameters to study a scalar-valued output from a model, e.g., drag over a parameterized airfoil shape. We may subsequently want to integrate, optimize, or approximate the output to address questions in uncertainty quantification, design, or sensitivity analysis. Unfortunately, methods addressing these questions often scale poorly with the input-parameter dimension; our motivation is to make such problems more tractable by reducing the input dimension. Ridge approximations constitute a class of function approximations that exploit dimension reduction over a subspace of parameter values. An intuitive approach to subspace-based dimension reduction is the method of active subspaces. Dimension reduction can also be posed in a nonlinear framework by composing with more general smooth immersions, e.g., the decoder half of an "autoencoder" popular in the study of neural networks. Despite any nonlinear generalization, both perspectives seek to restrict parameter combinations to a reduced-dimension smooth manifold. Consequently, understanding dimension reduction in general lends itself to considerations of Riemannian geometry. In other applications, model outputs depend explicitly on a non-Euclidean, manifold-valued domain, e.g., spaces of shapes or images. Whether implicitly through dimension reduction or explicitly by definition of the domain, reinterpreting a model's domain as a general Riemannian manifold poses interesting questions about extending active subspaces to smooth manifolds. We define ordered geodesics as submanifolds of a Riemannian manifold along which a differentiable function changes more than average, in an analogous globalized sense. These important submanifolds become the so-called active manifold-geodesics (AMGs).
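To make the subspace-based starting point concrete, the following is a minimal sketch (not the speaker's implementation) of the standard active-subspace computation: estimate the matrix C = E[∇f ∇fᵀ] by Monte Carlo over sampled inputs, take its eigendecomposition, and use the leading eigenvectors as the subspace for a ridge approximation f(x) ≈ g(W₁ᵀx). The model `f` and its gradient here are hypothetical stand-ins with one dominant direction.

```python
import numpy as np

# Hypothetical model: a quadratic ridge function f(x) = (a . x)^2,
# which varies only along the single direction a.
a = np.array([1.0, 0.5, 0.01, 0.01, 0.01])

def grad_f(x):
    # Analytic gradient of (a . x)^2
    return 2.0 * (a @ x) * a

rng = np.random.default_rng(0)
m, N = 5, 1000                         # input dimension, Monte Carlo samples
X = rng.uniform(-1.0, 1.0, size=(N, m))
G = np.array([grad_f(x) for x in X])   # sampled gradients, shape (N, m)

# Sample average approximating C = E[grad f grad f^T]
C = G.T @ G / N

# Eigendecomposition; reorder so eigenvalues are descending
eigvals, W = np.linalg.eigh(C)
eigvals, W = eigvals[::-1], W[:, ::-1]

# A large gap after the first eigenvalue suggests a 1-D active subspace.
W1 = W[:, :1]
y = X @ W1   # reduced coordinate for a ridge approximation g(y)
```

A gap in the eigenvalue spectrum determines the subspace dimension in practice; the manifold extensions discussed in the talk generalize this linear construction.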
This talk will begin by summarizing active subspaces before discussing manifold extensions with simple examples over the 2-sphere. We will conclude with an explicit example approximating AMGs of induced drag over transonic airfoils, using a smooth local section of the principal GL-bundle as a candidate manifold of shapes.
Bio: Dr. Zach Grey graduated from the Aerospace Engineering Sciences department at the University of Colorado Boulder (CU). His research began in the Applied Mathematics and Statistics Department at the Colorado School of Mines (Mines), where he qualified for the Ph.D.; he later transferred to CU to continue working with his research advisor, Prof. Paul Constantine. Prior to CU and Mines, Zach earned a B.S. in Aerospace Engineering from Embry-Riddle Aeronautical University and an M.S. in Aero/Astro Engineering from Purdue. While working toward his degree at Purdue, Zach worked full time at the Rolls-Royce Corporation, eventually becoming a Robust Design Technical Lead. His research aims to leverage dimension reduction for integration, approximation, and optimization across a variety of engineering and scientific disciplines. Zach was recently awarded an NRC Postdoctoral Research Associateship to join the NIST ITL ACMD Mathematical Analysis and Modeling group, working with Andrew Dienstfrey at the Boulder campus. During his time at NIST, Zach intends to research the computational implications of applying differential geometry to characterize artificial intelligence and machine learning approaches to approximation.
Note: Visitors from outside NIST must contact Cathy Graham, (301) 975-3800, at least 24 hours in advance.