Week 18.10.2021 – 24.10.2021

Monday

Lonti: What is an Anomaly?

Regular Seminar Chris Herzog (KCL)

at:
10:00 KCL
room YouTube
abstract:

Lonti Autumn 2021 Series: Lecture 1. Release of Recorded Lecture. Available here: https://youtu.be/hiUnq_5iiPM. Four examples of anomalies are presented, two from quantum mechanics and two from quantum field theory. The first is a charged bead on a wire in the presence of a magnetic field; this example of a 't Hooft anomaly is related to the theta angle in Yang-Mills theory. The remaining three examples illustrate scale and conformal anomalies: we scatter a plane wave off an attractive delta function in two dimensions, and we examine a massless scalar field, both in two dimensions without a boundary and in three dimensions with one.
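
A standard textbook way to see the scale anomaly in the delta-function example, sketched here in units with $\hbar = 2m = 1$ (my conventions, which need not match the lecture's): the two-dimensional Schrödinger problem

\[
  -\nabla^2 \psi \;-\; g\,\delta^{(2)}(\mathbf{x})\,\psi \;=\; E\,\psi
\]

is classically scale invariant, since the coupling $g$ is dimensionless. Searching for a bound state $E = -E_B$ in momentum space gives

\[
  1 \;=\; g \int^{\Lambda} \frac{d^2k}{(2\pi)^2}\,\frac{1}{k^2 + E_B}
    \;=\; \frac{g}{4\pi}\,\ln\frac{\Lambda^2 + E_B}{E_B}
  \quad\Longrightarrow\quad
  E_B \;\simeq\; \Lambda^2\, e^{-4\pi/g},
\]

so the logarithmic divergence forces a cutoff $\Lambda$, and a scale appears in a classically scale-free problem (dimensional transmutation).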

Tuesday

New postdoc introduction

Regular Seminar David Tennyson and Morteza Hosseini (Imperial)

at:
13:30 IC
room B630
abstract:

Morteza Hosseini: "Microstates of AdS black holes in string theory." I will review recent advances in counting black hole microstates in AdS spacetimes. David Tennyson: "Supersymmetric flux backgrounds of string theory." I will review my work on the geometry of supersymmetric flux backgrounds of string theory through generalised geometry. In particular, I will introduce the exceptional complex structure and discuss some applications.

Wednesday

Renormalization Group Flows on Line Defects

Regular Seminar Avia Raviv-Moshe (Stony Brook U., New York, SCGP)

at:
15:45 KCL
room Online
abstract:

In this talk, we will consider line defects in d-dimensional CFTs. The ambient CFT places nontrivial constraints on renormalization group flows on such line defects. We will see that, as a consequence, the flow on line defects is irreversible, and that a canonical, monotonically decreasing entropy function exists. This construction generalizes the g-theorem to line defects in arbitrary dimensions. We will demonstrate the generalization in concrete examples, including a flow between Wilson loops in four dimensions and an O(3) bosonic theory coupled to impurities with large isospin.
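
Schematically, the advertised entropy function can be phrased as follows (a sketch in one common normalization, which need not be the speaker's): for a defect wrapped on a circle of radius $L$, with expectation value $\langle \mathcal{D}(L)\rangle$, define

\[
  s(L) \;=\; \Big(1 - L\,\frac{\partial}{\partial L}\Big)\,\log\,\langle \mathcal{D}(L)\rangle ,
  \qquad
  \frac{\partial s}{\partial \log L} \;\le\; 0 ,
\]

so $s$ decreases monotonically along the defect flow; at fixed points it reduces to the $\log g$ of Affleck and Ludwig, which is how the two-dimensional g-theorem is recovered.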

Thursday

Matrix bootstrap revisited

Journal Club Zechuan Zheng (ENS Paris)

at:
15:45 Other
room Zoom, instructions in abstract
abstract:

The matrix bootstrap is a new method for the numerical study of (multi-)matrix models in the planar limit, using loop equations for the moments of the distribution (Ward identities together with factorization of traces at infinite N). The information lost by keeping only a finite number of low moments is compensated by positivity conditions on the correlation matrix. Numerically solving the loop equations subject to these conditions yields inequalities for the lowest moments, which converge rapidly to the exact values as the number of moments used increases. In our work https://arxiv.org/pdf/2108.04830.pdf, the method was tested on the standard one-matrix model, as well as on an "unsolvable" two-matrix model with interaction tr[A, B]^2 and quartic potentials. We propose a significant improvement of H. Lin's original proposal for the matrix bootstrap by introducing a relaxation procedure: we replace the non-convex, non-linear loop equations by convex inequalities. The results look quite convincing, and the matrix bootstrap appears to be an interesting alternative to the Monte Carlo method: for < tr A^2 >, for example, the precision reaches 6 digits with modest computer resources. I will discuss the prospects for applying the method to other physically interesting systems. --------------------- Part of the London Integrability Journal Club. Please register at integrability-london.weebly.com if you are a new participant. The link will be emailed on Tuesday.
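
For orientation, below is a minimal numerical sketch of the one-matrix version of the bootstrap (an illustration of the moment-positivity mechanism only, not the authors' code; the potential V(A) = A^2/2 + g A^4/4, the truncation level K, and the scan window are choices made for this example):

  import numpy as np

  def moments(m2, g, n_max):
      # Planar loop equations for V(A) = A^2/2 + g A^4/4:
      #   m_{k+1} + g m_{k+3} = sum_{j=0}^{k-1} m_j m_{k-1-j},  m_k = <tr A^k>/N.
      # Given m_2, they determine all higher moments recursively.
      m = np.zeros(n_max + 1)
      m[0], m[2] = 1.0, m2          # odd moments vanish for an even potential
      for k in range(n_max - 2):
          s = sum(m[j] * m[k - 1 - j] for j in range(k))
          m[k + 3] = (s - m[k + 1]) / g
      return m

  def positive(m2, g, K, tol=1e-10):
      # Positivity of the moment (Hankel) matrix M_ij = m_{i+j}, 0 <= i,j <= K,
      # i.e. positivity of <tr P(A)^2>/N for all polynomials P of degree K.
      m = moments(m2, g, 2 * K)
      H = np.array([[m[i + j] for j in range(K + 1)] for i in range(K + 1)])
      return np.linalg.eigvalsh(H)[0] >= -tol

  # Scan the single unknown m_2: the window allowed by positivity narrows
  # as K grows, bracketing the exact planar value of <tr A^2>/N.
  g, K = 1.0, 8
  grid = np.linspace(0.3, 0.7, 2001)
  window = [x for x in grid if positive(x, g, K)]
  if window:
      print(f"K={K}: m_2 confined to [{window[0]:.4f}, {window[-1]:.4f}]")
  else:
      print(f"K={K}: no allowed m_2 in the scanned window")

As K grows, the positivity window shrinks around the exact planar value, which is the rapid convergence described in the abstract; the relaxation step of the paper is what makes the analogous non-convex multi-matrix problem tractable.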

The Principles of Deep Learning Theory

Regular Seminar Dan Roberts (MIT)

at:
14:00 QMW
room Zoom
abstract:

[for zoom link please email s.nagy@qmul.ac.uk] Deep learning is an exciting approach to modern artificial intelligence based on artificial neural networks. The goal of this talk is to provide a blueprint, using tools from physics, for theoretically analyzing deep neural networks of practical relevance. This task encompasses both understanding the statistics of initialized deep networks and determining the training dynamics of such an ensemble when learning from data. In terms of their "microscopic" definition, deep neural networks are a flexible set of functions built out of many basic computational blocks called neurons, with many neurons in parallel organized into sequential layers. Borrowing from the effective theory framework, we will develop a perturbative 1/n expansion around the limit of an infinite number of neurons per layer and systematically integrate out the parameters of the network. We will explain how the network simplifies at large width and how the propagation of signals from layer to layer can be understood in terms of a Wilsonian renormalization group flow. This will make manifest that deep networks have a tuning problem, analogous to criticality, that needs to be solved in order to make them useful. Ultimately we will find a "macroscopic" description of wide and deep networks in terms of weakly-interacting statistical models, with the strength of the interactions between neurons growing with the depth-to-width aspect ratio of the network. Time permitting, we will explain how these interactions induce representation learning. This talk is based on the book "The Principles of Deep Learning Theory," co-authored with Sho Yaida and grounded in research carried out also in collaboration with Boris Hanin; it will be published next year by Cambridge University Press.
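
As a small illustration of the finite-width effects described above, the following toy numpy experiment (my own sketch, not code from the book) estimates the excess kurtosis of the output of a randomly initialized ReLU network, a simple probe of the connected four-point function, which vanishes in the infinite-width Gaussian limit and grows with the depth-to-width ratio:

  import numpy as np

  rng = np.random.default_rng(0)

  def output_samples(width, depth, n_samples=4000, c_w=2.0):
      # One scalar output of a randomly initialized ReLU network on a fixed
      # input, resampling the weights each time; c_w = 2 is the critical
      # initialization that keeps signals from exploding or vanishing.
      outs = np.empty(n_samples)
      x = np.ones(width)
      for s in range(n_samples):
          z = x
          for _ in range(depth):
              W = rng.normal(0.0, np.sqrt(c_w / width), (width, width))
              z = np.maximum(W @ z, 0.0)        # hidden layer: ReLU
          v = rng.normal(0.0, np.sqrt(1.0 / width), width)
          outs[s] = v @ z                       # linear readout
      return outs

  # Excess kurtosis is zero for an exact Gaussian; at finite width it is
  # nonzero and grows with the depth-to-width ratio of the network.
  for width, depth in [(256, 4), (32, 4), (32, 16)]:
      o = output_samples(width, depth)
      kurt = np.mean(o**4) / np.mean(o**2)**2 - 3.0
      print(f"width={width:3d} depth={depth:2d}  excess kurtosis {kurt:+.2f}")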