I am excited to announce that, for one year starting in September 2019, I will be working as a Research Scientist at Google Brain, Montreal. Then, starting in 2020, I will join the Mathematics and Statistics Department at McGill University as an assistant professor.
I received my Ph.D. from the Mathematics Department at the University of Washington (2017) under Prof. Dmitriy Drusvyatskiy. I then held a postdoctoral position in the Industrial and Systems Engineering Department at Lehigh University, where I worked with Prof. Katya Scheinberg, followed by an NSF postdoctoral fellowship (2018-2019) under Prof. Stephen Vavasis in the Combinatorics and Optimization Department at the University of Waterloo.
My research broadly focuses on designing and analyzing algorithms for large-scale optimization problems, motivated by applications in data science. The techniques I use draw from a variety of fields including probability, complexity theory, and convex and nonsmooth analysis.
Washington, Lehigh University, and Waterloo have strong optimization groups that span many departments: Math, Stats, CSE, EE, and ISE. If you are interested in optimization talks at these places, check out the following seminars:
- Trends in Optimization Seminar (TOPS/CORE) at Washington
- COR@L at Lehigh University
- Combinatorics and Optimization at Waterloo
EMAIL: yumiko88(at)uw(dot)edu or email@example.com
I study continuous optimization. My work has centered on various aspects of convex optimization, with connections to practical applications, particularly machine learning. I work in variational analysis, a field that generalizes the concepts of differential calculus to functions that lack differentiability, and I also pursue research on first-order algorithms for efficiently minimizing large sums of composite functions. My current work focuses on first-order methods for structured non-convex and non-smooth problems.
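To illustrate the kind of composite problem these methods target, here is a minimal sketch of the proximal gradient method applied to a smooth-plus-nonsmooth objective (an l1-regularized least-squares problem). The example, its data, and all parameter choices are mine for illustration, not taken from any of the papers listed below.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, iters=500):
    # Minimize F(x) = 0.5 * ||Ax - b||^2 + lam * ||x||_1 by alternating a
    # gradient step on the smooth term with the prox of the nonsmooth term.
    # Step size 1/L, where L = ||A||_2^2 bounds the Lipschitz constant of
    # the smooth term's gradient.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)              # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Tiny example: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = proximal_gradient(A, b, lam=0.1)
```

The soft-thresholding step is what distinguishes this from plain gradient descent: the nonsmooth l1 term is never differentiated, only accessed through its proximal map, which is the basic template behind much of the composite-minimization literature.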
You can view my CV here if you are interested in more details.
You can view my thesis, titled Structure and complexity in non-convex and nonsmooth optimization.
- C. Paquette and S. Vavasis. Potential-based analyses of first-order methods for constrained and composite optimization. arXiv (2019) (Submitted to Math. Program.)
- C. Paquette and K. Scheinberg. A stochastic line-search method with convergence rate. arXiv (2018) (Submitted to SIAM J. Optim.)
- D. Davis, D. Drusvyatskiy, K. MacPhee, and C. Paquette. Subgradient methods for sharp weakly convex functions. To appear in J. Optim. Theory App. (2018)
- D. Davis, D. Drusvyatskiy, and C. Paquette. The nonsmooth landscape of phase retrieval. To appear in IMA J. Numer. Anal. (2018)
- C. Paquette, H. Lin, D. Drusvyatskiy, J. Mairal, and Z. Harchaoui. Acceleration for Gradient-Based Non-Convex Optimization. 21st International Conference on Artificial Intelligence and Statistics (AISTATS 2018)
- D. Drusvyatskiy and C. Paquette. Efficiency of minimizing compositions of convex functions and smooth maps. Math. Program. (2018)
- D. Drusvyatskiy and C. Paquette. Variational analysis of spectral functions simplified. J. Convex Anal. 25(1), 2018.
I have given talks on the research above at the following conferences:
- Stochastic Optimization: summer school talk, ADSI Summer School on Foundations of Data Science, Seattle, WA (Aug. 2019); my notes can be found here
- Algorithms for stochastic problems lacking convexity or smoothness, Mathematics Colloquium, Ohio State University, Columbus, OH (Feb. 2019); my slides can be found here
- Algorithms for stochastic problems lacking convexity or smoothness, Applied Mathematics Colloquium, Brown University, Providence, RI (Feb. 2019); my slides can be found here
- Algorithms for stochastic problems lacking convexity or smoothness, Applied Mathematics Seminar, McGill University, Montreal, QC (Feb. 2019); my slides can be found here
- Algorithms for stochastic problems lacking convexity or smoothness, Applied Math and Analysis Seminar, Duke University, Durham, NC (Jan. 2019); my slides can be found here
- Algorithms for stochastic problems lacking convexity or smoothness, Google Brain, Montreal, QC (Jan. 2019); my slides can be found here
- An adaptive line search method for stochastic optimization, Cornell ORIE Young Researchers Workshop, Ithaca, NY (Oct. 2018); my slides can be found here
- New analysis of adaptive stochastic optimization methods via supermartingales, Part II: Convergence analysis for stochastic line search, Lehigh University DIMACS, Bethlehem, PA (Aug. 2018); my slides can be found here
- Generic Acceleration Schema Beyond Convexity, INFORMS Annual Meeting, Houston, TX (Oct. 2017); my slides can be found here
- Minimization of convex composite, Lehigh University Optimization Seminar, Bethlehem, PA (Sept. 2017); my slides can be found here
- Proximal methods for minimizing convex compositions, SIAM Conference on Optimization, Vancouver, BC (May 2017)
- Catalyst for Gradient-based Nonconvex Optimization, Inria-Grenoble Seminar, Grenoble (April 2017)
- Generic acceleration schema beyond convexity, Optimization and Statistical Learning, Les Houches (April 2017)
- Proximal methods for minimizing convex compositions, West Coast Optimization Meeting, University of British Columbia (Sept. 2016); my slides can be found here
Math 1152: Calculus II, Autumn 2017; Website
I have taught the following courses:
- Math 125 BC/BD: Calculus II Quiz Section, Winter 2017; course webpage
- Math 307 E: Intro to Differential Equations, Winter 2016
- Math 124 CC: Calculus I, Autumn 2015
- Math 307 I: Intro to Differential Equations, Spring 2015
- Math 125 BA/BC: Calculus II, Winter 2015
- Math 307 K: Intro to Differential Equations, Autumn 2014
- Math 307 L: Intro to Differential Equations, Spring 2014