Hannah Lawrence

I am a research analyst at the Center for Computational Mathematics of the Flatiron Institute in New York, where I develop new fundamental algorithms at the interface of signal processing and deep learning, with the ultimate goal of cryo-EM image reconstruction. Broadly, I enjoy working on mathematical algorithms for data science.

I spent last summer at Microsoft Research, where I was lucky to be mentored by Cameron Musco (now at UMass Amherst). I've also spent productive summers at Reservoir Labs and the Center for Computational Biology. As an undergraduate at Yale in applied math and computer science, I had the extremely good fortune of being advised by Amin Karbasi and Dan Spielman.

Email  /  Github  /  LinkedIn  /  CV

Research

My research interests include sparse recovery, theoretical machine learning, signal processing, numerical linear algebra, and spectral graph theory, especially applications where these areas intersect. I like developing provably robust, efficient algorithms for inverse problems, sometimes in imaging applications.

Here are a few questions I've been thinking about recently:

  • What invariances should a message-passing neural net for multireference alignment obey?
  • What's the fastest way to rotationally align two spherical functions?
  • What generalizations of (1) the restricted isometry property and (2) leverage score sampling might be useful for off-grid sparse recovery?
  • What practical considerations determine the real-world utility of switching-constrained online optimization algorithms?
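As a toy one-dimensional analogue of the alignment questions above (a sketch, not any method from my papers): the best circular shift aligning two signals of length n can be found in O(n log n) time via FFT cross-correlation, rather than by trying all n shifts.

```python
import numpy as np

def best_circular_shift(f, g):
    """Return the shift s maximizing the correlation between f and
    np.roll(g, s), computed in O(n log n) via the FFT.
    corr[s] = sum_t f[t] * g[t - s] (indices mod n)."""
    corr = np.fft.ifft(np.fft.fft(f) * np.conj(np.fft.fft(g))).real
    return int(np.argmax(corr))
```

For example, if g is a circular shift of f, the recovered s satisfies np.roll(g, s) == f. The same Fourier-domain idea, generalized from the circle to the sphere and SO(3), underlies fast rotational alignment.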

Minimax Regret of Switching-Constrained Online Convex Optimization: No Phase Transition
Lin Chen, Qian Yu, Hannah Lawrence, Amin Karbasi

We establish the minimax regret of switching-constrained online convex optimization, a realistic optimization framework in which algorithms must act in real time to minimize cumulative loss, but are penalized if they are too erratic.
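One simple way to respect a switching budget (a minimal sketch of the standard blocking idea, not necessarily the algorithm analyzed in the paper) is to run online gradient descent lazily: split the horizon into equal blocks, accumulate gradients within a block, and change the played point only at block boundaries, so the number of switches never exceeds the budget.

```python
import numpy as np

def blocked_ogd(grads, eta, num_switches):
    """Lazy online gradient descent on R^d with a switching budget:
    the horizon is split into blocks of equal length, and the played
    point is updated (with the block's accumulated gradient) only at
    block boundaries. `grads` is a list of per-round gradient vectors."""
    T = len(grads)
    block = max(1, T // num_switches)
    x = np.zeros_like(grads[0], dtype=float)
    played, acc = [], np.zeros_like(x)
    for t, g in enumerate(grads):
        played.append(x.copy())
        acc += g
        if (t + 1) % block == 0:   # switch only at block boundaries
            x = x - eta * acc
            acc = np.zeros_like(x)
    return played
```

With T rounds and a budget of K switches, the point changes at most K times, which is the constraint this line of work studies.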

Low-Rank Toeplitz Matrix Estimation via Random Ultra-Sparse Rulers
Hannah Lawrence, Jerry Li, Cameron Musco, Christopher Musco

By building new randomized "ruler" sampling constructions, we show how to use sublinear sparse Fourier transform algorithms for sample-efficient, low-rank Toeplitz covariance estimation.
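To illustrate the basic object (a toy sketch, not the paper's actual construction): a sparse ruler is a small set of indices whose pairwise differences cover every lag 0, …, n−1, so a Toeplitz covariance, which depends only on lags, can be estimated from samples at the ruler positions alone. The ruler {0, 1, 2, 3, 6, 9} and the naive averaging estimator below are purely illustrative.

```python
import numpy as np

def is_ruler(marks, n):
    """Check that every lag 0..n-1 arises as a difference of two marks."""
    diffs = {abs(a - b) for a in marks for b in marks}
    return set(range(n)) <= diffs

# An illustrative sparse ruler for n = 10: 6 marks cover all 10 lags.
marks = [0, 1, 2, 3, 6, 9]
assert is_ruler(marks, 10)

def estimate_toeplitz(samples, marks, n, trials):
    """Estimate the Toeplitz covariance entries c[d] by averaging the
    products x[a] * x[b] over all mark pairs with |a - b| = d, across
    trials. `samples` has shape (trials, len(marks)): each row holds
    the entries of one length-n vector observed at the ruler positions."""
    c, counts = np.zeros(n), np.zeros(n)
    for t in range(trials):
        x = samples[t]
        for i, a in enumerate(marks):
            for j, b in enumerate(marks):
                d = abs(a - b)
                c[d] += x[i] * x[j]
                counts[d] += 1
    return c / counts
```

Classical sparse rulers need only O(√n) marks; the appeal of combining them with sparse Fourier transforms is reducing both the sample and computational cost further when the covariance is low rank.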

Service
Applied Math Departmental Student Advisory Committee, Spring 2019

Dean's Committee on Science and Quantitative Reasoning, Fall 2018

Undergraduate Learning Assistant, CS 365 (Design and Analysis of Algorithms), Spring 2018

Undergraduate Learning Assistant, CS 223 (Data Structures and Algorithms), Spring 2017

Undergraduate Learning Assistant, CS 201 (Introduction to Computer Science), Fall 2017

Website template credits.