You are kindly invited to attend my Ph.D. proposal presentation. Please see the details below.
Title: Local Acceleration of the Frank-Wolfe Algorithm
Date: May 11, 2021
Time: 10:00 AM EST
Machine Learning Ph.D. Student
School of Industrial and Systems Engineering
Georgia Institute of Technology
Sebastian Pokutta (Advisor)
Guanghui (George) Lan
Conditional Gradient (CG) algorithms are an important class of constrained optimization algorithms that eschew the need for projections, relying instead on linear programming oracles to ensure feasibility. In this proposal, we present two algorithms from this family, both of which show improved local convergence rates in the vicinity of the optimum when minimizing smooth and strongly convex functions. The first, dubbed Locally Accelerated Conditional Gradients, uses first-order information about the function and achieves a local linear convergence rate over polytopes that improves on that of existing CG variants. The second, the Second-Order Conditional Gradient Sliding algorithm (inspired by the Conditional Gradient Sliding algorithm), uses second-order information to obtain local quadratic convergence over polytopes under a set of assumptions.
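For readers unfamiliar with the family, the projection-free idea can be illustrated with the vanilla Frank-Wolfe method over the probability simplex, where the linear programming oracle reduces to picking the best vertex. This is a minimal sketch for intuition only (the function name, step-size rule, and example objective are illustrative choices, not the proposal's algorithms):

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_steps=500):
    # Vanilla Frank-Wolfe (conditional gradient) over the probability
    # simplex. No projection is ever computed: the linear minimization
    # oracle over the simplex returns a vertex, namely the standard
    # basis vector at the coordinate where the gradient is smallest.
    x = x0.copy()
    for t in range(n_steps):
        g = grad(x)
        v = np.zeros_like(x)
        v[np.argmin(g)] = 1.0            # LP oracle: best simplex vertex
        gamma = 2.0 / (t + 2.0)          # standard open-loop step size
        x = (1 - gamma) * x + gamma * v  # convex combination stays feasible
    return x

# Illustrative objective: minimize ||x - b||^2 over the simplex,
# with b itself a point of the simplex, so the minimizer is b.
b = np.array([0.1, 0.7, 0.2])
x_star = frank_wolfe_simplex(lambda x: 2 * (x - b), np.ones(3) / 3)
```

Because each iterate is a convex combination of simplex vertices, feasibility holds automatically at every step, which is exactly the property that lets CG methods avoid projections.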