Hannah Lawrence
I am a PhD student in machine learning at MIT, where I am fortunate to be advised by Ankur Moitra. I am also a member of the wonderful Atomic Architects, led by Tess Smidt. Previously, I was a summer research intern at D. E. Shaw Research, working on tokenization of small molecules for LLMs, and at the Open Catalyst Team at Meta FAIR, studying equivariant architectures for chemistry applications.
Before graduate school, I was a research analyst at the Center for Computational Mathematics of the Flatiron Institute in New York, where I worked on developing algorithms at the interface of equivariant deep learning and signal processing for cryoEM.
Broadly, I enjoy developing theoretically principled tools for deep learning (often in scientific domains), with a focus on both understanding and imposing structure in neural representations.
I spent summer 2019 at Microsoft Research, where I was lucky to be mentored by Cameron Musco. I've also spent productive summers at Reservoir Labs and the Center for Computational Biology. I was an undergrad at Yale in applied math and computer science, where I had the good fortune of being advised by Amin Karbasi and Dan Spielman.
Finally, I co-founded the Boston Symmetry Group, which hosts a recurring workshop for researchers interested in symmetries in machine learning. Follow us on Twitter, shoot us an email, or join our mailing list if you're interested in attending!
Email / Github / LinkedIn / Twitter / Google Scholar
Research
My primary research interests include symmetry-aware (equivariant) machine learning and scientific applications. In addition, I enjoy developing theoretically principled tools for deep learning, for applications from vision to interpretability to PDEs.
Here is a non-exhaustive list of high-level questions I've been thinking about recently (or at least, as of the last update to this website on 4/9/25):
- In the age of LLMs, what is the future of equivariant learning? (Here are some slides from a recent talk I gave, offering some perspective on this.)
- How can we probe how a network "thinks" by discovering structure in its hidden representations?
- What is the role of equivariance, e.g. to permutations, in large language models (LLMs)? To what extent is equivariance learned?
- What is the right way to tokenize geometric objects? How does tokenization transcend mere compression? What properties are desirable in a tokenization scheme?
- How can we make canonicalization work, in theory and in practice, as an approach for enforcing symmetries in black-box models?
- How much hot chocolate can I consume at a single research institution?
Positional Encodings as Group Representations: A Unified Framework
Derek Lim*,
Hannah Lawrence,
Ningyuan (Teresa) Huang,
Erik H. Thiede
ICML TAG-ML Workshop, 2023.
We observe that many popular positional encodings (sinusoidal, RoPE, graph PEs, etc.) can be interpreted as group representations, which formalizes some of their desirable properties (e.g., invariance to global translation). This perspective also suggests a simple framework for building positional encodings with new invariances, such as to the special Euclidean group.
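As a quick illustration (my own sketch, not code from the paper): the classic sinusoidal encoding already has this structure. Shifting the position by s acts on the encoding by a block-diagonal rotation matrix R(s), and these matrices compose as R(s1)R(s2) = R(s1+s2), i.e. the map s → R(s) is a representation of the translation group. The function names below are hypothetical:

```python
import numpy as np

def sinusoidal_pe(t, freqs):
    """Sinusoidal positional encoding at scalar position t:
    one (sin, cos) pair per frequency."""
    return np.concatenate([(np.sin(w * t), np.cos(w * t)) for w in freqs])

def shift_rep(s, freqs):
    """Block-diagonal rotation R(s) acting on the encoding.
    Satisfies R(s1) @ R(s2) == R(s1 + s2): a representation of (R, +)."""
    d = 2 * len(freqs)
    R = np.zeros((d, d))
    for i, w in enumerate(freqs):
        c, sn = np.cos(w * s), np.sin(w * s)
        R[2 * i:2 * i + 2, 2 * i:2 * i + 2] = [[c, sn], [-sn, c]]
    return R

freqs = 1.0 / 10000 ** (np.arange(4) / 4)  # a few transformer-style frequencies
t, s = 3.0, 5.0
# Translating the position is the same as rotating the encoding:
assert np.allclose(shift_rep(s, freqs) @ sinusoidal_pe(t, freqs),
                   sinusoidal_pe(t + s, freqs))
# Homomorphism property: R(s1) R(s2) = R(s1 + s2)
assert np.allclose(shift_rep(2.0, freqs) @ shift_rep(3.0, freqs),
                   shift_rep(5.0, freqs))
```

The same lens applies to RoPE (rotations applied multiplicatively inside attention) and motivates designing encodings from representations of other groups.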
Artificial Intelligence for Science in Quantum, Atomistic, and Continuum Systems
Xuan Zhang*,
Limei Wang*,
Jacob Helwig*,
Youzhi Luo*,
Cong Fu*,
Yaochen Xie*,
...,
Hannah Lawrence,
...,
Shuiwang Ji
Under review, 2023.
A broad survey of machine learning methods for the physical sciences, spanning quantum, atomistic, and continuum systems.
Organizer, Boston Symmetry Day, Fall 2023 - Present
Teaching Assistant, 6.S966 Symmetry and its Applications to Machine Learning, Spring 2023
Hertz Foundation Summer Workshop Committee, Fall 2021 and Spring 2022
Women in Learning Theory Mentor, Spring 2020
Applied Math Departmental Student Advisory Committee, Spring 2019
Dean's Committee on Science and Quantitative Reasoning, Fall 2018
Undergraduate Learning Assistant, CS 365 (Design and Analysis of Algorithms), Spring 2018
Undergraduate Learning Assistant, CS 201 (Introduction to Computer Science), Fall 2017
Undergraduate Learning Assistant, CS 223 (Data Structures and Algorithms), Spring 2017