
WHO: Nataly Brukhim, Institute for Advanced Study (IAS) and the Center for Discrete Mathematics and Theoretical Computer Science (DIMACS)

WHEN: Thursday, December 25th 2025 at 12:00

WHERE: BUILDING 503 (Computer Science) AUDITORIUM


Title: Modern Challenges in Learning Theory

Abstract:

Modern machine learning relies on its ability to generalize from limited data, yet a principled theoretical understanding of generalization remains incomplete. While binary classification is well understood in the classical PAC framework, even its natural extension to multiclass learning is substantially more challenging.
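For reference, the classical characterization alluded to above (background, not a result of the talk): in binary PAC learning, a hypothesis class is learnable if and only if its Vapnik–Chervonenkis (VC) dimension is finite, and the VC dimension determines the sample complexity up to constants:

```latex
% Binary PAC learning: a hypothesis class H is PAC-learnable
% iff its VC dimension d = VCdim(H) is finite.
% Samples needed to reach error \epsilon with confidence 1 - \delta
% (realizable case, up to constant factors):
m(\epsilon, \delta) \;=\; \Theta\!\left(\frac{d + \log(1/\delta)}{\epsilon}\right)
```

The multiclass results presented in the talk concern the analogue of this characterization when the label space is larger than two.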

In this talk, I will present recent progress in multiclass learning that characterizes when generalization is possible and how much data is required, resolving a long-standing open problem on extending the Vapnik–Chervonenkis (VC) dimension beyond the binary setting. I will then turn to complementary results on efficient learning via boosting, extending boosting theory to multiclass classification while maintaining computational and statistical efficiency even for unbounded label spaces.

Lastly, I will discuss generalization in sequential learning settings, where a learner interacts with an environment over time. We introduce a new framework that subsumes classically studied settings, including bandits and statistical queries, along with a combinatorial parameter that bounds the number of interactions required for learning.

Bio:

Nataly Brukhim is a postdoctoral researcher at the Institute for Advanced Study (IAS) and the Center for Discrete Mathematics and Theoretical Computer Science (DIMACS). She received her Ph.D. in Computer Science from Princeton University, where she was advised by Elad Hazan, and was a student researcher at Google AI Princeton. She earned her M.Sc. and B.Sc. in Computer Science from Tel Aviv University.