The Class of ’27 Lecture Series is a special lecture held each year. It was established in 1960 to honor Professor Edwin Allen, the first chair of the Math Sciences Department. The series was founded by three members of the Class of 1927: Issac Arnold, Alexander Hassan, and Isadore Fixman.

2019

Mar 12, 2019
Class of ’27 Lecture II: “Convergence Analysis of Stochastic Optimization Methods via Martingales”

Abstract: We will present a very general framework for unconstrained stochastic optimization which encompasses standard frameworks such as line search and trust region using random models. In particular, this framework retains desirable practical features such as step acceptance criteria, trust-region adjustment, and the ability to utilize second-order models. The framework is based on bounding the expected stopping time of a stochastic process that satisfies certain assumptions...

DCC 330 4:00 pm

Mar 11, 2019
Class of ’27 Lecture I: “Gradient Descent Without Gradients”

Abstract: The core of continuous optimization lies in using information from first- and second-order derivatives to produce steps that improve the objective function value. Classical methods such as gradient descent and Newton's method rely on this information. Stochastic Gradient Descent, recently popular in machine learning, does not require the gradient itself, but still requires an unbiased estimate of it. However, in many applications neither derivatives nor their unbiased estimates are available. We will thus discuss a variety of methods, both deterministic and stochastic, which construct useful gradient approximations from only function values...

Amos Eaton 214 4:00 pm
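As a rough illustration of the kind of approximation the abstract describes, here is a minimal forward-difference gradient estimate built from function values only. The step size `h` and the quadratic test function are illustrative assumptions, not details from the lecture:

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Forward-difference estimate of the gradient of f at x.

    Each partial derivative uses two function values:
    (f(x + h*e_i) - f(x)) / h, so no derivatives are needed.
    """
    x = np.asarray(x, dtype=float)
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h                    # perturb one coordinate at a time
        g[i] = (f(x + e) - fx) / h
    return g

# Illustrative check: f(x) = x0^2 + 3*x1 has gradient (2*x0, 3),
# so at (1, 2) the estimate should be close to (2, 3).
g = fd_gradient(lambda x: x[0]**2 + 3 * x[1], np.array([1.0, 2.0]))
```

The forward difference costs n + 1 function evaluations per gradient; the lecture's scope (deterministic and stochastic approximations) is broader than this sketch.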


2016

Mar 22, 2016
Class of ’27 Lecture: Coordinate Descent Methods

Coordinate descent is an approach for minimizing functions in which only a subset of the variables are allowed to change at each iteration, while the remaining variables are held fixed. This approach has been popular in applications since the earliest days of optimization, because it is intuitive and because the low-dimensional searches that take place at each iteration are inexpensive in many applications. In recent years, the popularity of coordinate descent methods has grown further because of their usefulness in data analysis. In this talk we describe situations in which coordinate descent methods are useful, and discuss several variants of these methods and their convergence properties. We describe recent analysis of the convergence of asynchronous parallel versions of these methods, which achieve high efficiency on multicore computers.

Amos Eaton 214 4:00 pm
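The low-dimensional searches described in the abstract can be sketched as follows; this is a minimal cyclic variant with a fixed step size, where the step size, the quadratic example, and the `grad_i` helper are illustrative assumptions rather than the lecturer's method:

```python
import numpy as np

def coordinate_descent(grad_i, x0, step=0.1, n_iters=200):
    """Cyclic coordinate descent with a fixed step size.

    At each iteration only one coordinate changes, using its partial
    derivative grad_i(x, i); all other coordinates are held fixed.
    """
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    for k in range(n_iters):
        i = k % n                     # cycle through the coordinates
        x[i] -= step * grad_i(x, i)   # inexpensive one-dimensional step
    return x

# Illustrative check: f(x) = (x0 - 1)^2 + (x1 + 2)^2 is minimized
# at (1, -2); the iterates should converge there.
grad_i = lambda x, i: 2 * (x[i] - (1.0 if i == 0 else -2.0))
x_star = coordinate_descent(grad_i, [0.0, 0.0])
```

For a separable quadratic like this each coordinate update is independent; the asynchronous parallel variants mentioned in the abstract exploit exactly this kind of per-coordinate structure.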

Mar 21, 2016
Class of ’27 Lecture: Fundamental Optimization Methods in Data Analysis

Optimization formulations and algorithms are vital tools for solving problems in data analysis. There has been particular interest in some fundamental, elementary optimization algorithms that were previously thought to have only niche appeal. Stochastic gradient, coordinate descent, and accelerated first-order methods are three examples. We outline applications in which these approaches are useful, discuss the basic properties of these methods, and survey some recent developments in the analysis of their convergence behavior.

Amos Eaton 214 4:00 pm

2009

Oct 20, 2009
Corina Tarnita, Harvard University


1996

Apr 20, 1996
Marshall Slemrod, University of Wisconsin

Apr 19, 1996
Marshall Slemrod, University of Wisconsin