Found 2 result(s)
Regular Seminar Gary Shiu (Wisconsin U., Madison)
at: 15:00 room zoom abstract: | We are faced with an explosion of data in many areas of physics, but very often it is not the size but the complexity of the data that makes extracting physics from big datasets challenging. As I will discuss in this talk, data has shape, and the shape of data encodes the underlying physics. Persistent homology is a tool in computational topology developed for quantifying the shape of data. I will discuss three applications of topological data analysis: 1) identifying structure of the string landscape, 2) constraining cosmological parameters from CMB measurements and large-scale structure data, and 3) detecting and classifying phases of matter. Persistent homology condenses these datasets into their most relevant (and interpretable) features, so that simple statistical pipelines are sufficient in these contexts. This suggests that TDA can be used in conjunction with machine learning algorithms and can improve their architectures. [for zoom link please contact jung-wook(dot)kim(at)qmul(dot)ac(dot)uk] |
Regular Seminar Gary Shiu (University of Wisconsin)
at: 13:30 room zoom 871 9223 5980 abstract: | We are faced with an explosion of data in many areas of physics, but very often it is not the size but the complexity of the data that makes extracting physics from big datasets challenging. As I will discuss in this talk, data has shape, and the shape of data encodes the underlying physics. Persistent homology is a tool in computational topology developed for quantifying the shape of data. I will discuss three applications of topological data analysis: 1) identifying structure of the string landscape, 2) constraining primordial non-Gaussianity from CMB measurements and large-scale structure data, and 3) detecting and classifying phases of matter. Persistent homology condenses these datasets into their most relevant (and interpretable) features, so that simple statistical pipelines are sufficient in these contexts. This suggests that TDA can be used in conjunction with machine learning algorithms and can improve their architectures. Based on https://arxiv.org/abs/2009.14231, https://arxiv.org/abs/2009.04819, https://arxiv.org/abs/1907.10072, https://arxiv.org/abs/1812.06960, https://arxiv.org/abs/1712.08159. [please email a.held@imperial.ac.uk for zoom link or password] |
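To make the pipeline sketched in the abstracts concrete, below is a minimal illustration, assuming the open-source ripser, numpy, and scikit-learn Python packages: it computes degree-1 persistence diagrams for synthetic point clouds (a noisy circle versus a filled disk), condenses each diagram into a few interpretable summary numbers, and feeds those to a plain linear classifier. The datasets, feature choices, and classifier here are illustrative stand-ins, not the analyses of the referenced papers.

import numpy as np
from ripser import ripser                      # Vietoris-Rips persistent homology
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def sample_circle(n=200, noise=0.05):
    """Noisy circle: carries one prominent loop (H1) feature."""
    theta = rng.uniform(0, 2 * np.pi, n)
    pts = np.c_[np.cos(theta), np.sin(theta)]
    return pts + noise * rng.standard_normal(pts.shape)

def sample_disk(n=200):
    """Filled disk: topologically trivial, so its H1 diagram is near-empty."""
    r = np.sqrt(rng.uniform(0, 1, n))
    theta = rng.uniform(0, 2 * np.pi, n)
    return np.c_[r * np.cos(theta), r * np.sin(theta)]

def diagram_summary(dgm):
    """Condense a persistence diagram into a few interpretable numbers:
    longest lifetime, total persistence, and feature count."""
    if dgm.size == 0:
        return np.zeros(3)
    lifetimes = dgm[:, 1] - dgm[:, 0]
    lifetimes = lifetimes[np.isfinite(lifetimes)]
    if lifetimes.size == 0:
        return np.zeros(3)
    return np.array([lifetimes.max(), lifetimes.sum(), float(lifetimes.size)])

# Two classes of point clouds whose difference is purely topological.
clouds = [sample_circle() for _ in range(20)] + [sample_disk() for _ in range(20)]
labels = np.array([1] * 20 + [0] * 20)

# Persistent homology in degree 1 (loops), then simple summary features.
features = np.array([diagram_summary(ripser(X, maxdim=1)['dgms'][1]) for X in clouds])

# A plain linear classifier suffices once the topology has been summarized,
# echoing the abstract's point that simple statistical pipelines are enough.
clf = LogisticRegression(max_iter=1000).fit(features, labels)
print("training accuracy:", clf.score(features, labels))

The design choice mirrors the abstracts: the heavy lifting is done by persistent homology, which reduces each dataset to a small, interpretable feature vector, after which any simple statistical or machine-learning model can be used downstream.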