Nonlinear Algebra Seminar
December 12, 1:10 - 5 pm
Evans 740, UC Berkeley
Schedule
1:10 - 1:40 | Ruriko Yoshida |
1:40 - 2:10 | Marco David |
2:10 - 2:40 | Svala Sverrisdóttir |
2:40 - 2:50 | Break |
2:50 - 3:20 | Maksym Zubkov |
3:20 - 3:50 | Kristen Dawson |
3:50 - 4:00 | Break |
4:00 - 5:00 | Dmitriy Morozov |
Titles and Abstracts
Speaker:
Ruriko Yoshida (Naval Postgraduate School)
Title:
Tropical Fermat-Weber Polytropes
Abstract:
In this talk we discuss the geometry of tropical Fermat-Weber points in terms of the symmetric tropical metric over the tropical projective torus. It is well known that a tropical Fermat-Weber point of a given sample is not unique, and we show that the set of all possible Fermat-Weber points forms a polytrope. To prove this, we show that the tropical Fermat-Weber polytrope is a bounded cell of a tropical hyperplane arrangement given by both max- and min-tropical hyperplanes with apices at the sample points. We also define tropical Fermat-Weber gradients and provide a gradient descent algorithm that converges to the Fermat-Weber polytrope. This is joint work with J. Sabol, D. Barnhill and K. Miura.
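As a numerical illustration of the objective (a naive subgradient sketch in numpy, not the algorithm of the talk; the sample points, step size, and iteration count are arbitrary choices): the symmetric tropical distance is d_tr(u, v) = max_i(u_i - v_i) - min_i(u_i - v_i), and a Fermat-Weber point minimizes the sum of these distances to the sample.

    import numpy as np

    def trop_dist(u, v):
        # symmetric tropical distance on the tropical projective torus
        d = u - v
        return d.max() - d.min()

    def fermat_weber(sample, steps=2000, lr=0.01):
        # naive subgradient descent on sum_i d_tr(x, v_i);
        # a subgradient of d_tr(., v) at x is e_argmax - e_argmin of x - v
        x = sample.mean(axis=0)
        for _ in range(steps):
            g = np.zeros_like(x)
            for v in sample:
                d = x - v
                g[d.argmax()] += 1.0
                g[d.argmin()] -= 1.0
            x = x - lr * g
        return x - x[0]  # coordinates live in R^n modulo R(1, ..., 1)

    sample = np.array([[0., 0., 0.], [0., 3., 1.], [0., 1., 4.]])
    x = fermat_weber(sample)
    print(x, sum(trop_dist(x, v) for v in sample))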
Speaker:
Marco David (UC Berkeley)
Title:
Symplectic Learning for Hamiltonian Neural Networks
Abstract:
Machine learning methods are widely used to model and predict physical systems from observation data. Yet, they are often used as poorly understood “black boxes,” disregarding existing mathematical structure and invariants of the problem. The proposal of Hamiltonian Neural Networks (HNNs) from 2019 takes a first step towards a “gray box” approach, using physical insight to improve performance. Here, we explore a significantly improved training method for HNNs, exploiting the symplectic structure of Hamiltonian systems. This frees the loss from an artificial lower bound. Moreover, we are able to prove detailed analytic bounds on the training errors of HNNs which, in turn, renders them explainable. Finally, we present a post-training correction to obtain the true Hamiltonian only from discretized observation data, up to an arbitrary order.
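The basic HNN idea fits in a few lines: a scalar network H(q, p) is trained so that its symplectic gradient (dH/dp, -dH/dq) matches observed time derivatives. A minimal JAX sketch under that reading (the tiny architecture, synthetic harmonic-oscillator data, and plain squared-error loss are illustrative assumptions, not the improved training method of the talk):

    import jax, jax.numpy as jnp

    def H(params, qp):
        # scalar Hamiltonian H_theta(q, p): a tiny one-hidden-layer MLP
        W1, b1, w2 = params
        return jnp.tanh(W1 @ qp + b1) @ w2

    def vector_field(params, qp):
        # symplectic gradient: (dq/dt, dp/dt) = (dH/dp, -dH/dq)
        dH = jax.grad(H, argnums=1)(params, qp)
        dHdq, dHdp = jnp.split(dH, 2)
        return jnp.concatenate([dHdp, -dHdq])

    def loss(params, qp_batch, qpdot_batch):
        # match the network's vector field to observed time derivatives
        pred = jax.vmap(lambda z: vector_field(params, z))(qp_batch)
        return jnp.mean((pred - qpdot_batch) ** 2)

    key = jax.random.PRNGKey(0)
    k1, k2, k3 = jax.random.split(key, 3)
    params = (jax.random.normal(k1, (16, 2)), jnp.zeros(16),
              jax.random.normal(k3, (16,)))
    # one gradient step on data from H = (q^2 + p^2) / 2, so qpdot = (p, -q)
    qp = jax.random.normal(k2, (32, 2))
    qpdot = jnp.stack([qp[:, 1], -qp[:, 0]], axis=1)
    grads = jax.grad(loss)(params, qp, qpdot)
    params = jax.tree_util.tree_map(lambda p, g: p - 0.05 * g, params, grads)
    print(loss(params, qp, qpdot))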
Speaker:
Svala Sverrisdóttir (UC Berkeley)
Title:
Gram Matrices for Isotropic Vectors
Abstract:
We investigate determinantal varieties for symmetric matrices that have zero blocks along the main diagonal. In theoretical physics, these arise as Gram matrices for kinematic variables in quantum field theories. We study the ideals of relations among functions in the matrix entries that serve as building blocks for conformal correlators.
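As a toy instance of the objects involved (an illustrative example, not taken from the talk): a vector v is isotropic when v.v = 0 for the complex bilinear form, so the Gram matrix of a family of isotropic vectors has zero diagonal, and grouping pairwise orthogonal isotropic vectors produces zero blocks along the diagonal.

    import numpy as np

    # isotropic vectors: v.v = 0 under the complex bilinear (not Hermitian) form
    v1 = np.array([1, 1j, 0, 0]); v2 = np.array([0, 0, 1, 1j])
    w1 = np.array([1, -1j, 0, 0]); w2 = np.array([0, 0, 1, -1j])
    V = np.stack([v1, v2, w1, w2])

    G = V @ V.T  # bilinear Gram matrix, no conjugation
    print(G.real)
    # the 2x2 diagonal blocks for {v1, v2} and {w1, w2} are zero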
Speaker:
Maksym Zubkov (University of British Columbia)
Title:
The Geometry of Rational Neural Networks
Abstract:
Rational neural networks are feedforward neural networks with a rational activation function. These networks have found applications in approximating solutions of PDEs, as they are able to learn the poles of meromorphic functions. In this talk, we consider the simplest rational activation function, sigma(x) = 1/x, and study the geometry of the family of such architectures. We will show that the closure of the set of all possible shallow (one hidden layer) networks is an algebraic variety, called a neurovariety.
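A minimal sketch of the parametrization in question (the layer sizes and weights below are arbitrary illustrative choices): a shallow network with activation sigma(t) = 1/t sends its weights to a rational function of the input, and the neurovariety is the closure of the image of this map.

    import numpy as np

    def shallow_rational(W1, b1, W2, b2, x):
        # one hidden layer with rational activation sigma(t) = 1/t:
        # f(x) = W2 @ (1 / (W1 @ x + b1)) + b2, a rational map in x
        return W2 @ (1.0 / (W1 @ x + b1)) + b2

    # a width-3 network from R to R: a sum of three simple poles plus a constant
    W1 = np.array([[1.0], [1.0], [2.0]])
    b1 = np.array([-1.0, 2.0, 0.5])
    W2 = np.array([[0.5, -1.0, 2.0]])
    b2 = np.array([0.3])

    print(shallow_rational(W1, b1, W2, b2, np.array([3.0])))
    # varying the weights sweeps out the networks whose closure is the neurovariety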
Speaker:
Kristen Dawson (San Francisco State University)
Title:
Positive Semidefinite Matrix Factorizations
Abstract:
A positive semidefinite (psd) factorization of a nonnegative matrix M expresses each entry of M as the inner product of two psd matrices. These factorizations correspond to spectrahedral lifts of a polytope associated with M. The aim of the talk is to characterize the uniqueness of a psd factorization of a matrix of rank 3 using psd matrices of size 2. The characterization is obtained using tools from rigidity theory.
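Concretely (a generic random example, not data from the talk): a psd factorization of size k writes M[i, j] = trace(A_i B_j) with all A_i, B_j psd of size k x k. The trace inner product of psd matrices is nonnegative, so M is automatically a nonnegative matrix, and for k = 2 its rank is at most dim Sym_2 = 3, matching the rank-3 setting of the talk.

    import numpy as np

    rng = np.random.default_rng(0)

    def random_psd(k):
        # G @ G.T is always positive semidefinite
        G = rng.standard_normal((k, k))
        return G @ G.T

    # psd factorization of size 2: M[i, j] = trace(A[i] @ B[j])
    A = [random_psd(2) for _ in range(4)]
    B = [random_psd(2) for _ in range(5)]
    M = np.array([[np.trace(Ai @ Bj) for Bj in B] for Ai in A])

    print(M.min() >= 0)                # entries are nonnegative
    print(np.linalg.matrix_rank(M))    # generically 3 = dim of 2x2 symmetric matrices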
Speaker:
Dmitriy Morozov (Lawrence Berkeley National Laboratory)
Title:
Recent Advances in Topological Data Analysis
Abstract: