Research Topics

String Theory

Formal developments and implications for particle physics and cosmology, with applications of geometry and ML.

Highlights:
  • A Quadrillion Standard Models from F-theory

    An enormous number of globally consistent F-theory compactifications are constructed that match the Standard Model’s chiral spectrum and achieve gauge coupling unification. All consistency conditions reduce to a single criterion on the base of elliptically fibered Calabi-Yau fourfolds, yielding at least 10^15 distinct models for toric bases.

  • Branes with Brains: Exploring String Vacua with Deep RL

    Deep reinforcement learning is used to navigate the string theory landscape, with an AI agent exploring type IIA compactifications with intersecting D6-branes. The agent outperforms random searches by finding up to 200 times more Standard Model-like solutions, discovering both known human strategies and novel approaches for solving the underlying Diophantine equations.
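
The search problem the agent faces can be caricatured as integer (Diophantine) constraint satisfaction. Below is a minimal sketch of the random-search baseline, with a single made-up "tadpole-like" constraint standing in for the actual tadpole cancellation and SUSY conditions of intersecting D6-brane models; all numbers and the constraint itself are illustrative assumptions, not the paper's setup.

```python
import random

random.seed(0)

# Hypothetical setup: a "brane configuration" is a list of integer
# winding-number pairs, and consistency is a single Diophantine equation.
TARGET = 8  # made-up tadpole charge to cancel

def satisfies_constraint(branes):
    # Each brane contributes a product of winding numbers to the "tadpole".
    return sum(n * m for n, m in branes) == TARGET

def random_search(trials=10_000, num_branes=3, wmax=3):
    # The blind-sampling baseline that the RL agent is compared against.
    found = []
    for _ in range(trials):
        cfg = [(random.randint(-wmax, wmax), random.randint(-wmax, wmax))
               for _ in range(num_branes)]
        if satisfies_constraint(cfg):
            found.append(cfg)
    return found

solutions = random_search()
```

An RL agent replaces the blind sampling with a policy that edits configurations incrementally, which is where the reported factor-of-200 gain over random search comes from.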

  • Matter From Geometric Deformations and Singular Spaces

    Deformation theory of algebraic singularities is used to study charged matter in M-theory and F-theory compactifications on elliptically fibered Calabi–Yau manifolds, reproducing ADE representation structures and non-simply-laced algebras from string junctions. The localized 6D charged matter spectrum is determined directly on singular elliptic Calabi–Yau threefolds by counting massless string junctions modulo SL(2,Z) monodromy, with results matching 6D anomaly cancellation predictions.

  • Machine Learning in the String Landscape

    Machine learning techniques are applied to analyze the vast string landscape, including decision trees for predicting F-theory compactifications and regression models for conjecturing gauge group ranks. New conjectures are generated and proven, such as a criterion for when E6 symmetry emerges in large ensembles of compactifications.

NN-FT

A correspondence between neural networks and field theory.

Highlights:
  • Universality of NN-FT and Liouville

    Any quantum field theory or probability distribution over tempered distributions can be described by a neural network with infinitely many parameters. As an example, the 2D Liouville theory is realized as a neural network, with numerical computations of vertex operator three-point functions matching the DOZZ formula.
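
The basic dictionary can be illustrated numerically: a field is a neural network with random parameters, and correlation functions are expectations over the parameter distribution. A minimal sketch with an illustrative one-layer tanh architecture and Gaussian parameter priors (not the construction used for Liouville theory):

```python
import numpy as np

# A field phi(x) is a network with random parameters; correlators are
# parameter-space expectations, e.g. G2(x, y) = E[phi(x) phi(y)].
rng = np.random.default_rng(0)
N = 1000  # network width

def sample_field(n_samples, xs):
    # phi(x) = (1/sqrt(N)) * sum_i w_i tanh(a_i x + b_i)
    xs = np.asarray(xs)
    a = rng.normal(size=(n_samples, N))
    b = rng.normal(size=(n_samples, N))
    w = rng.normal(size=(n_samples, N))
    pre = np.tanh(a[:, None, :] * xs[None, :, None] + b[:, None, :])
    return np.einsum('sn,sxn->sx', w, pre) / np.sqrt(N)

xs = [0.0, 0.5]
phi = sample_field(2000, xs)          # 2000 draws of the field at two points
G2 = (phi[:, 0] * phi[:, 1]).mean()   # Monte Carlo two-point function
```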

  • String Theory from Infinite Width Neural Networks

    Bosonic string theory is realized using ensembles of infinite-width neural networks, where the string tension is controlled by the output weight variance. This approach yields a new derivation of key results, such as the Veneziano and Virasoro-Shapiro amplitudes, as neural network correlators, and offers a computational framework for string theory phenomena.

  • Fermions and Supersymmetry in Neural Network Field Theories

    Fermionic neural network field theories are introduced using Grassmann-valued neural networks, with the central limit theorem generalized to Grassmann variables. Free Dirac spinors are realized at infinite width, four-fermion interactions at finite width, and Yukawa-type couplings via correlated parameter distributions. Constructions realizing supersymmetric quantum mechanics and field theories are presented through super-affine transformations on inputs.

ML for Physics & Math

Applications of machine learning techniques to solve problems in theoretical physics and mathematics, outside of string theory.

Highlights:
  • Learning to Unknot and Searching for Ribbons

    Transformers and reinforcement learning are applied to the UNKNOT decision problem using braid-word representations, with RL agents discovering simplification strategies via Markov moves and braid relations. Bayesian optimization and reinforcement learning are used to find ribbon disks bounded by knots, aiding efforts to disprove the smooth 4D Poincaré conjecture by identifying ribbon knots with up to 70 crossings.
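
As a flavor of the braid-word setup, the simplest simplification move, cancelling adjacent inverse generators, can be sketched as follows; the RL agents additionally use braid relations and Markov (de)stabilization moves, which are not implemented in this toy.

```python
# Braid words as lists of signed integers: i stands for the generator
# sigma_i, and -i for its inverse sigma_i^{-1}.
def free_reduce(word):
    # Cancel adjacent inverse pairs (sigma_i sigma_i^{-1} -> identity).
    # This is only the simplest move; braid relations and Markov moves
    # are needed for the full UNKNOT problem.
    stack = []
    for g in word:
        if stack and stack[-1] == -g:
            stack.pop()
        else:
            stack.append(g)
    return stack

# sigma_1 sigma_2 sigma_2^{-1} sigma_1^{-1} reduces to the empty word:
empty = free_reduce([1, 2, -2, -1])
```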

  • On the Generality and Persistence of Cosmological Stasis

    Hierarchical decays of matter species can lead to ‘stasis,’ a phase in which abundances remain constant despite Hubble expansion. Machine learning is used to analyze stasis in high-dimensional parameter spaces, revealing that exponential decay-rate models achieve inflation-like numbers of e-folds with fewer species than power-law alternatives. The findings support stasis as a generic cosmological phase, with implications for string theory and the axiverse.
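
A minimal toy of the underlying Boltzmann system illustrates the two competing effects: matter species redshift and decay into radiation while the universe expands. Units with 8πG/3 = 1, forward-Euler integration, and made-up initial abundances and widths; this is an illustrative sketch, not the paper's exponential or power-law ensemble analysis.

```python
import numpy as np

# rho_m: matter species densities; rho_r: radiation density;
# gammas: decay widths gamma_i.  Tracks the matter fraction Omega_M.
def evolve(rho_m, gammas, rho_r=0.0, dt=1e-3, steps=5000):
    rho_m = np.array(rho_m, dtype=float)
    gammas = np.asarray(gammas, dtype=float)
    omegas = []
    for _ in range(steps):
        H = np.sqrt(rho_m.sum() + rho_r)            # Friedmann equation
        dm = -(3 * H + gammas) * rho_m              # redshift + decay
        dr = -4 * H * rho_r + (gammas * rho_m).sum()
        rho_m = rho_m + dt * dm
        rho_r = rho_r + dt * dr
        omegas.append(rho_m.sum() / (rho_m.sum() + rho_r))
    return np.array(omegas)

# Matter fraction over time for three hierarchically decaying species:
omega = evolve(rho_m=[1.0, 0.5, 0.25], gammas=[10.0, 1.0, 0.1])
```

In a stasis phase the curve `omega` would plateau; the hierarchies studied in the paper are engineered so that decays and expansion balance over many e-folds.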

  • Rigor with Machine Learning from Field Theory to the Poincaré Conjecture

    A discussion of how machine learning can be used rigorously in mathematics and physics, from generating and verifying conjectures to applications in string theory and the smooth 4D Poincaré conjecture, with new connections between neural networks, field theory, and geometric flows.

Physics & Math for ML

Insights from physics and mathematics to advance machine learning theory and algorithms.

Highlights:
  • Kolmogorov-Arnold Networks

    Kolmogorov-Arnold Networks (KANs) are a neural network architecture that replaces the fixed linear weights on the edges of a multilayer perceptron with learnable univariate functions. KANs achieve strong performance on a variety of tasks with improved interpretability, and are effective in scientific machine learning, including solving PDEs and discovering physical laws.
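
A minimal sketch of a single KAN layer: each edge (j → i) carries its own learnable 1D function φ_ij, and an output node simply sums its incoming edges, y_i = Σ_j φ_ij(x_j). A Gaussian radial-basis expansion is used here for the edge functions; the reference implementation uses B-splines plus a base activation, so this basis is an illustrative simplification, and the coefficients are randomly initialized rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

class ToyKANLayer:
    def __init__(self, d_in, d_out, n_basis=8):
        # Gaussian bump basis on a fixed grid; the coefficients are the
        # learnable parameters of the edge functions.
        self.centers = np.linspace(-2.0, 2.0, n_basis)
        self.coeffs = rng.normal(0.0, 0.1, size=(d_out, d_in, n_basis))

    def __call__(self, x):
        # Evaluate every basis function on each input coordinate, then
        # contract with the per-edge coefficients and sum incoming edges.
        basis = np.exp(-(x[:, None] - self.centers) ** 2)    # (d_in, n_basis)
        edge_vals = np.einsum('oib,ib->oi', self.coeffs, basis)
        return edge_vals.sum(axis=1)                         # (d_out,)

layer = ToyKANLayer(d_in=3, d_out=2)
y = layer(np.array([0.1, -0.5, 1.2]))
```

Interpretability follows from the structure: after training, each φ_ij can be plotted and often read off symbolically.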

  • Quantum Mechanics and Neural Networks

    Any Euclidean quantum mechanical theory can be represented as a neural network, with reflection positivity ensuring unitarity via mechanisms such as parameter splitting or Markov properties. Non-differentiability of the network relates to non-commuting operators, and composing deep networks with Markov processes enables the construction of more complex quantum systems. Numerical examples recover the Heisenberg uncertainty relation, commutators, and energy spectra.

  • TASI Lectures on Physics for Machine Learning

    Lectures covering neural network theory from a physics perspective, focusing on expressivity, statistics, and dynamics. Classic results like universal approximation and Gaussian process equivalences are discussed alongside modern topics such as neural tangent kernels and Kolmogorov-Arnold networks.
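
The Gaussian process equivalence can be checked empirically: at a fixed input, the output distribution of a randomly initialized one-hidden-layer network becomes Gaussian as the width grows, by the central limit theorem over hidden units. A minimal sketch using excess kurtosis as the Gaussianity diagnostic; the architecture and scalings are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def output_samples(width, n_samples=5000, x=0.7):
    # One-hidden-layer tanh network with Gaussian parameters, evaluated
    # at a single input x, with 1/sqrt(width) output scaling.
    a = rng.normal(size=(n_samples, width))
    b = rng.normal(size=(n_samples, width))
    w = rng.normal(size=(n_samples, width))
    return (w * np.tanh(a * x + b)).sum(axis=1) / np.sqrt(width)

def excess_kurtosis(s):
    # 0 for a Gaussian; nonzero values signal non-Gaussian tails.
    s = (s - s.mean()) / s.std()
    return float((s ** 4).mean() - 3.0)

ek_narrow = excess_kurtosis(output_samples(width=2))
ek_wide = excess_kurtosis(output_samples(width=500))
```

At width 2 the output retains visibly non-Gaussian statistics, while at width 500 the excess kurtosis is close to zero; finite-width corrections like these are what give interacting (non-Gaussian) field theories in the NN-FT correspondence.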