About

My research focuses on the methodological foundations, theory, and fairness of machine learning. I am interested in developing frameworks for algorithmic assessment, providing rigorous guarantees for algorithmic performance, and understanding and mitigating the disparate impact of machine learning systems.

I am currently a postdoc at Microsoft Research, New England. Before moving to Microsoft in 2018, I completed my Ph.D. at UC Berkeley with Michael Jordan and Benjamin Recht, working on optimization algorithms and dynamical systems. Before that, I studied applied mathematics and minored in philosophy at Harvard University. In my free time, I enjoy reading, listening to music, laughing with friends, and taking long walks.

Research

Preprints

A Lyapunov analysis of momentum methods in optimization
(joint with Michael Jordan and Benjamin Recht)
Link

On symplectic optimization
(joint with Michael Betancourt and Michael Jordan)
Link


Publications

Approximate cross-validation: guarantees for model assessment and selection
(joint with Maximilian Kasy and Lester Mackey)
To appear at the International Conference on Artificial Intelligence and Statistics (AISTATS). 2020.
Link


The disparate equilibria of algorithmic decision making when individuals invest rationally
(joint with Lydia Liu, Nika Haghtalab, Adam Kalai, Christian Borgs, and Jennifer Chayes)
In the ACM Conference on Fairness, Accountability, and Transparency (FAT*). 2020.
Link

Accelerating rescaled gradient descent: fast optimization of smooth functions
(joint with Lester Mackey and Andre Wibisono)
In Advances in Neural Information Processing Systems (NeurIPS). 2019.
Link

Posteriors, conjugacy, and exponential families for completely random measures
(joint with Tamara Broderick and Michael Jordan)
In Bernoulli. 2018.
Link

The marginal value of adaptive gradient methods in machine learning
(joint with Rebecca Roelofs, Mitchell Stern, Nathan Srebro, and Benjamin Recht)
In Advances in Neural Information Processing Systems (NeurIPS) (Spotlight Presentation). 2017.
Link

Breaking locality accelerates block Gauss-Seidel
(joint with Stephen Tu, Shivaram Venkataraman, Michael Jordan and Benjamin Recht)
In the International Conference on Machine Learning (ICML). 2017.
Link

A variational perspective on accelerated methods in optimization
(joint with Andre Wibisono and Michael Jordan)
In Proceedings of the National Academy of Sciences (PNAS). 2016.
Link

Streaming variational Bayes
(joint with Tamara Broderick, Nicholas Boyd, Andre Wibisono and Michael Jordan)
In Advances in Neural Information Processing Systems (NeurIPS). 2013.
Link