SMS scnews item created by Catherine Meister at Wed 3 Dec 2025 1346
Type: Seminar
Modified: Wed 3 Dec 2025 1453; Wed 3 Dec 2025 1459; Wed 3 Dec 2025 1548
Distribution: World
Expiry: 31 Dec 2025
Calendar1: 10 Dec 2025 1730-1930
CalLoc1: Eastern Avenue Auditorium and Theatre (F19.03.315)
CalTitle1: Gradient optimization methods: the benefits of instability
Auth: cmeister@staff-10-48-21-70.vpnuser.sydney.edu.au (cmei0631) in SMS-SAML

Special Seminar: Bartlett

Special Seminar, 'Gradient optimization methods: the benefits of instability' (part of the Mathematical Science of AI Safety Focus Period)

Speaker: Peter Bartlett, UC Berkeley

Date & time: Wednesday 10 December, lecture from 5:30 – 6:30 pm AEDT, with the opportunity for post-lecture discussion from 6:30 – 7:30 pm (with light refreshments)

Location: Eastern Avenue Auditorium and Theatre (F19.03.315)

Abstract: Deep learning, the technology underlying the recent progress in AI, has revealed some major surprises from the perspective of theory. These methods seem to achieve their outstanding performance through different mechanisms from those of classical learning theory, mathematical statistics, and optimization theory. Optimization in deep learning relies on simple gradient descent algorithms, traditionally viewed as a time discretization of gradient flow. In practice, however, large step sizes, large enough to cause oscillation of the loss, exhibit performance advantages. This talk will review recent results on gradient descent on the logistic loss with a step size large enough that the optimization trajectory is at the "edge of stability." We show the benefits of this initial oscillatory phase for linear functions and for multi-layer networks.

Based on joint work with Pierre Marion, Matus Telgarsky, Jingfeng Wu, and Bin Yu.
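
To make the oscillatory regime concrete, here is a minimal sketch (not from the talk; the toy dataset, step sizes, and iteration counts are illustrative assumptions): gradient descent on the logistic loss for a linear model, comparing a step size in the classically stable range with one large enough that the loss oscillates before it settles.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = np.sign(X @ np.array([1.0, -1.0]))          # linearly separable labels in {-1, +1}

def loss(w):
    # mean log(1 + exp(-y <x, w>)), computed stably
    return np.logaddexp(0.0, -y * (X @ w)).mean()

def grad(w):
    s = 0.5 * (1.0 - np.tanh(0.5 * y * (X @ w)))  # sigmoid(-y <x, w>)
    return -(X * (y * s)[:, None]).mean(axis=0)

for eta in (1.0, 20.0):                          # small step size vs. a large one
    w = np.zeros(2)
    trace = []
    for _ in range(40):
        w = w - eta * grad(w)
        trace.append(loss(w))
    print(f"eta={eta:>4}: " + " ".join(f"{v:.3f}" for v in trace[:8]))

With standard-normal features, the curvature of this loss at w = 0 is roughly 0.25 times the identity, so the classical stability threshold for the step size is about 2/0.25 = 8; the eta = 20 run starts beyond it and its loss trace oscillates at first, while the eta = 1 run decreases monotonically.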

About the speaker: Peter Bartlett is Professor of the Graduate School in Statistics and Computer Science at UC Berkeley and Principal Scientist at Google DeepMind. At Berkeley, he is the Machine Learning Research Director at the Simons Institute for the Theory of Computing, Director of the Foundations of Data Science Institute, and Director of the Collaboration on the Theoretical Foundations of Deep Learning, and he has served as Associate Director of the Simons Institute. He is President of the Association for Computational Learning and co-author with Martin Anthony of the book Neural Network Learning: Theoretical Foundations. He was awarded the Malcolm McIntosh Prize for Physical Scientist of the Year, and has been an Institute of Mathematical Statistics Medallion Lecturer, an IMS Fellow and Australian Laureate Fellow, a Fellow of the ACM, a recipient of the UC Berkeley Chancellor's Distinguished Service Award, and a Fellow of the Australian Academy of Science.

