I received my Ph.D. from the Mathematics department at the University of Washington (2017) under Prof. Dmitriy Drusvyatskiy. I then held a postdoctoral position in the Department of Industrial and Systems Engineering at Lehigh University, where I worked with Prof. Katya Scheinberg. In July 2018, I started my NSF postdoctoral fellowship in the Combinatorics and Optimization department at the University of Waterloo.

My research broadly focuses on designing and analyzing algorithms for large-scale optimization problems, motivated by applications in data science. The techniques I use draw from a variety of fields including probability, complexity theory, and convex and nonsmooth analysis.

Washington, Lehigh University, and Waterloo have strong optimization groups that span many departments: Math, Stats, CSE, EE, and ISE. If you are interested in optimization talks at these places, check out the following seminars:

EMAIL: yumiko88(at)uw(dot)edu or cop318(at)lehigh(dot)edu

OFFICE: University of Waterloo, Combinatorics and Optimization Department, MC, 5471


I study continuous optimization. My work has centered on various aspects of convex optimization, with connections to practical applications, particularly machine learning. I work in variational analysis, which generalizes the concepts of differential calculus to functions that lack differentiability, and I also pursue research on first-order algorithms for efficiently minimizing large sums of composite functions. My current work focuses on applying first-order methods to structured non-convex and non-smooth problems.

You can view my CV here if you are interested in more details.

You can view my thesis, titled Structure and complexity in non-convex and nonsmooth optimization.



I have given talks on the research above at the following conferences and seminars:

  • (Upcoming) Mathematics and Statistics seminar, McGill University, Montreal, QC (Feb. 2019);
  • (Upcoming) Applied math and analysis seminar, Duke University, Durham, NC (Jan. 2019);
  • Algorithms for stochastic problems lacking convexity or smoothness, Google Brain Montreal, Montreal, QC (Jan. 2019); My slides can be found here
  • An adaptive line search method for stochastic optimization, Cornell ORIE's Young Researchers Workshop (2018), Ithaca, NY (Oct. 2018); My slides can be found here
  • New analysis of adaptive stochastic optimization methods via supermartingales Part II: Convergence analysis for stochastic line search, Lehigh University DIMACS (2018), Bethlehem, PA (Aug. 2018); My slides can be found here
  • Generic Acceleration Schema Beyond Convexity, INFORMS annual meeting (2017), Houston, TX (Oct. 2017); My slides can be found here
  • Minimization of convex composite, Lehigh University Optimization Seminar, Bethlehem, PA (Sept. 2017); My slides can be found here
  • Proximal methods for minimizing convex compositions, SIAM Conference on Optimization, Vancouver, BC (May 2017)
  • Catalyst for Gradient-based Nonconvex Optimization, Inria-Grenoble Seminar, Grenoble (April 2017)
  • Generic acceleration schema beyond convexity, Optimization and Statistical Learning, Les Houches (April 2017)
  • Proximal methods for minimizing convex compositions, West Coast Optimization Meeting, University of British Columbia (September 2016); My slides can be found here


Current Course:

Math 1152: Calculus II, Autumn 2017; course website

EMAIL: paquette.34(at)osu(dot)edu


Past Courses

I have taught the following courses:

  • Math 125 BC/BD: Calculus II Quiz Section, Winter 2017; course webpage
  • Math 307 E: Intro to Differential Equations, Winter 2016
  • Math 124 CC: Calculus 1, Autumn 2015
  • Math 307 I: Intro to Differential Equations, Spring 2015
  • Math 125 BA/BC: Calculus 2, Winter 2015
  • Math 307 K: Intro to Differential Equations, Autumn 2014
  • Math 307 L: Intro to Differential Equations, Spring 2014