The London TDA seminar is a research seminar that brings together researchers and practitioners in Topological Data Analysis based in and around London. It takes place four times a year in the School of Mathematical Sciences building at Queen Mary University of London, in Mile End.
Schedule 08.02.2023
The seminar will take place in MB-503 at 11am. After the seminar we will go for lunch at The Curve.
11am: Miguel O’Malley (Wesleyan University)
Title: Alpha magnitude and dimension
Abstract: The notion of dimension is a fundamental characteristic of many spaces of interest, and in particular of subspaces of Euclidean space. Intuitively, a line segment should have dimension 1, a disc dimension 2, and so on. More complicated subspaces, such as fractals, exhibit more exotic dimensions that are harder to estimate. In this talk, we introduce alpha magnitude, a numerical invariant of a filtered simplicial complex associated to a metric space, and discuss some of its key properties. Alpha magnitude is inspired by the persistent magnitude introduced by Govc and Hepworth, and is derived from the alpha complex. We further introduce alpha magnitude dimension, an invariant inspired by Meckes' notion of magnitude dimension, as a new measure of dimension. We present heuristic observations and a resulting conjecture relating alpha magnitude dimension to the Minkowski dimension of compact subspaces of Euclidean space. Because alpha magnitude leverages the low computational cost of the alpha complex, we propose it as an easily computable and promising method for estimating the dimension of real-world data sets.
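For those who would like to experiment before the talk, here is a minimal, hedged sketch (not the speaker's code) of the kind of computation the abstract describes: a persistent-magnitude-style alternating sum, in the spirit of Govc and Hepworth, evaluated over the bars of the alpha filtration computed with the gudhi library, followed by a rough log-log slope in the spirit of Meckes' magnitude dimension. The conventions chosen below (signs, treatment of the infinite bar, taking square roots of gudhi's squared-radius filtration values) are assumptions made only for this sketch and may differ from the definitions given in the talk.

import numpy as np
import gudhi

def alpha_magnitude_sketch(points, ts):
    # Alternating sum over bars of exp(-t*birth) - exp(-t*death) for the alpha filtration.
    st = gudhi.AlphaComplex(points=points).create_simplex_tree()
    bars = st.persistence()  # list of (degree, (birth, death)); gudhi stores squared radii
    values = []
    for t in ts:
        total = 0.0
        for deg, (b, d) in bars:
            # Assumption: an infinite bar contributes only its birth term.
            death_term = 0.0 if np.isinf(d) else np.exp(-t * np.sqrt(d))
            total += (-1) ** deg * (np.exp(-t * np.sqrt(b)) - death_term)
        values.append(total)
    return np.array(values)

# Growth rate of the magnitude-style function in log-log coordinates, echoing
# Meckes' magnitude dimension; for a finite sample this is only meaningful over
# an intermediate range of scales (large enough to see the shape of the space,
# small enough not to resolve individual sample points).
pts = np.random.default_rng(0).random((2000, 2))  # sample of the unit square
ts = np.logspace(0.5, 1.5, 10)
mags = alpha_magnitude_sketch(pts, ts)
slope = np.polyfit(np.log(ts), np.log(mags), 1)[0]
print(f"log-log slope over this range of scales: {slope:.2f}")

The dimension estimate is then read off as the asymptotic slope; how well this works in practice, and over which range of scales, is exactly the kind of question the talk addresses.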
11.30am: Antonio Leitao (CENTAI)
Title: Measuring topological expressive power of neural network architectures
Abstract: This talk will explore how topological data analysis can be used to understand the expressive power of neural networks. I will show how the topological features of the decision boundary provide perhaps the closest notion we have of the intrinsic complexity of a classification problem, and introduce the concept of topological expressive power: the number of distinct topological classes that a neural network architecture can represent. The results show that topological expressiveness correlates in a complex way with many features of a network's architecture, while depending only weakly on the total number of parameters.
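As a concrete, hedged illustration of what "topology of the decision boundary" can mean in practice (this is not the speaker's pipeline; the libraries, toy data set, and thresholds below are choices made only for this sketch): train a small classifier on a two-dimensional toy problem, sample points close to its decision boundary, and read off approximate Betti numbers from the persistent homology of that sample.

import numpy as np
import gudhi
from sklearn.datasets import make_circles
from sklearn.neural_network import MLPClassifier

# A toy problem whose decision boundary should be roughly a single closed curve,
# i.e. Betti numbers b0 = 1 and b1 = 1.
X, y = make_circles(n_samples=1000, noise=0.05, factor=0.4, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0).fit(X, y)

# Crude proxy for the decision boundary: grid points whose predicted class
# probability is close to 0.5. The grid resolution and the 0.15 band width are
# arbitrary choices for this sketch.
xx, yy = np.meshgrid(np.linspace(-1.5, 1.5, 200), np.linspace(-1.5, 1.5, 200))
grid = np.c_[xx.ravel(), yy.ravel()]
proba = clf.predict_proba(grid)[:, 1]
boundary = grid[np.abs(proba - 0.5) < 0.15]

# Persistent homology of the alpha complex built on the boundary sample; the
# persistent Betti numbers at a mid-range scale approximate the Betti numbers
# of the underlying boundary (gudhi's alpha filtration values are squared radii).
st = gudhi.AlphaComplex(points=boundary).create_simplex_tree()
st.persistence()
print("approximate Betti numbers of the decision boundary:",
      st.persistent_betti_numbers(0.01, 0.02))

Topological expressive power, as described in the abstract, then asks how many distinct outcomes of this kind of measurement a given architecture can realise as its weights vary.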
If you are interested in being added to the mailing list for this seminar, please contact Nina Otter at n dot last-name @ qmul.ac.uk.