
Physics of Learning Collaboration

Harnessing the Fundamental Sciences to Break AI Out of Its Black Box

Our mission

Throughout the history of machine learning, physicists have made groundbreaking contributions: the Hopfield Model, the Boltzmann Machine, and simulated annealing are just a few of the many ideas which laid the foundations for today's deep learning and artificial intelligence.


Progress in ML and AI continues to accelerate, but to date it is largely empirical. Today's AI is a black box. We believe the time has come for a concerted effort to discover the principles that make it work. We will draw on physics, computer science, neuroscience, mathematics, and statistics to understand the structure of data, the representations and architectures that encode it, and the dynamics of learning. With this solid foundation, we will unlock the mysteries of scaling laws and reasoning, and explore the potential and limitations of AI in new ways.


Our goal is to identify and develop fundamental principles of learning, and break AI out of its black box.  We aspire to open a new era of scientific inquiry into the nature of intelligence, to the benefit of all branches of science and society.

Meet the team

Survey lectures

Florent Krzakala

How Do Neural Networks Learn Simple Functions with Gradient Descent?

Feb 13, 2025

In this talk, I will review the mechanisms by which two-layer neural networks learn simple high-dimensional functions from data over time. We will focus on the intricate interplay between the algorithm, the number of iterations, and the complexity of the task at hand: how gradient descent and stochastic gradient descent learn features of the target function and improve generalization over random initialization and kernel methods. I will also illustrate how ideas and methods at the intersection of high-dimensional probability and statistical physics provide fresh perspectives on these questions.
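The setting in the abstract can be sketched in a few lines of code. The toy experiment below is our own illustration, not material from the talk: a two-layer ReLU network is trained by online SGD to learn a simple high-dimensional "single-index" target y = relu(w*·x), and its test error is compared against the random-initialization baseline. All dimensions, learning rates, and step counts are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Teacher: a simple high-dimensional single-index target y = relu(w* . x)
d = 50                                   # input dimension (arbitrary)
w_star = rng.standard_normal(d) / np.sqrt(d)

def target(X):
    return np.maximum(X @ w_star, 0.0)

# Student: two-layer ReLU network f(x) = a . relu(W x) with m hidden units
m = 100
W = rng.standard_normal((m, d)) / np.sqrt(d)
a = rng.standard_normal(m) / np.sqrt(m)

def forward(X, W, a):
    h = np.maximum(X @ W.T, 0.0)         # hidden activations, shape (n, m)
    return h @ a

def mse(X, y, W, a):
    return np.mean((forward(X, W, a) - y) ** 2)

X_test = rng.standard_normal((2000, d))
y_test = target(X_test)
loss_init = mse(X_test, y_test, W, a)    # error at random initialization

# Online SGD: one fresh sample per step, gradient of the squared error
lr = 0.005
for step in range(20000):
    x = rng.standard_normal(d)
    y = max(x @ w_star, 0.0)
    h = np.maximum(W @ x, 0.0)
    err = a @ h - y
    a -= lr * err * h                    # gradient w.r.t. output weights
    W -= lr * err * np.outer(a * (h > 0), x)  # gradient w.r.t. first layer

loss_sgd = mse(X_test, y_test, W, a)
print(loss_init, loss_sgd)               # SGD should reduce the test error
```

The student can only beat the random-initialization (lazy/kernel-like) baseline by moving its first-layer weights toward the relevant direction w*, which is exactly the feature-learning effect the abstract refers to.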
