Assistant Professor, UC Berkeley
Chemical Engineering and EECS
I am also a member of Berkeley AI Research (BAIR), and part of the AI+Science group in EECS and the theory group in Chemical Engineering.
Contact info: aditik1 at berkeley dot edu
Research interests
I am interested in developing machine learning methods driven by the distinct challenges and opportunities of the natural sciences, with particular interest in physics-inspired machine learning. Some areas of exploration include: general learning strategies that probe the relevance of physical inductive biases in ML models for scientific problems; the advantages ML can bring to classical physics-based numerical solvers, such as through end-to-end differentiable frameworks and implicit layers; and better learning strategies for handling distribution shifts in the physical sciences. These methods are informed by and grounded in applications to atomistic and continuum problems, including fluid mechanics, molecular dynamics, and related areas. This work also interfaces with other fields, including numerical analysis, dynamical systems theory, quantum mechanics, computational geometry, optimization, and category theory.
Some examples of recent work include:
- thinking about principled design choices for scaling neural network interatomic potentials effectively (NeurIPS, 2024),
- improving the stability of neural network interatomic potentials for molecular dynamics via a training procedure enabled by differentiable Boltzmann estimators (arXiv, 2024),
- neural networks + spectral methods for solving PDEs: using orthogonal bases to learn transformations between spectral coefficients in a neural operator setting, and leveraging Parseval's identity to create a spectral loss function that allows training fully in spectral space (ICLR, 2024); a short illustrative sketch of such a spectral loss follows this list,
- PDE-constrained optimization as a layer in a neural network (ICLR, 2023), along with a scaled-up approach that improves computational efficiency by representing the PDE-constrained layer computations via a mixture-of-experts (ICLR, 2024); a sketch of a differentiable PDE layer also follows this list,
- improving the efficiency of E(3)-equivariant operations in neural networks with a new tensor product formulation (ICLR spotlight, 2024),
- generative modeling for molecular conformers that incorporates a coarse-graining procedure during training (BAIR blog post; arXiv:2306.14852, 2023).
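As a minimal sketch of the spectral loss idea (illustrative only, not the implementation from the paper): assuming an orthonormal FFT, Parseval's identity makes the l2 distance between two fields equal the l2 distance between their Fourier coefficients, so a model that predicts spectral coefficients can be trained without ever mapping back to physical space.

```python
# Illustrative sketch: with an orthonormal DFT, Parseval's identity gives
# ||u - v||^2 = ||U - V||^2 for Fourier coefficients U, V, so an MSE loss
# on spectral coefficients matches an MSE loss in physical space.
import numpy as np

def spectral_mse(pred_coeffs: np.ndarray, target_coeffs: np.ndarray) -> float:
    """Mean squared error computed directly on complex spectral coefficients."""
    return float(np.mean(np.abs(pred_coeffs - target_coeffs) ** 2))

# Sanity check of the identity on random 1-D fields.
rng = np.random.default_rng(0)
u, v = rng.standard_normal(256), rng.standard_normal(256)
U, V = np.fft.fft(u, norm="ortho"), np.fft.fft(v, norm="ortho")
assert np.isclose(spectral_mse(U, V), np.mean((u - v) ** 2))
```

Because the two losses agree, gradients computed entirely in spectral space drive the same optimization as physical-space training.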
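And a minimal sketch of a PDE solve used as a differentiable layer (again illustrative, not the papers' method: a dense 1-D Poisson solve stands in for the PDE-constrained layer, and names like PDELayer are placeholders).

```python
# Illustrative sketch: a layer whose forward pass solves a discretized PDE,
# here K u = f for a 1-D Poisson operator K. torch.linalg.solve is
# differentiable, so upstream network parameters receive exact gradients.
import torch

def poisson_operator(n: int, h: float) -> torch.Tensor:
    """Dense finite-difference Laplacian with homogeneous Dirichlet BCs."""
    K = torch.zeros(n, n, dtype=torch.float64)
    idx = torch.arange(n)
    K[idx, idx] = 2.0 / h**2
    K[idx[:-1], idx[:-1] + 1] = -1.0 / h**2
    K[idx[1:], idx[1:] - 1] = -1.0 / h**2
    return K

class PDELayer(torch.nn.Module):
    """Maps a predicted source term f to the PDE solution u of K u = f."""
    def __init__(self, n: int = 64):
        super().__init__()
        self.K = poisson_operator(n, h=1.0 / (n + 1))

    def forward(self, f: torch.Tensor) -> torch.Tensor:
        return torch.linalg.solve(self.K, f)

# Usage: a small network predicts the source term; the loss compares the
# resulting PDE solution to an observed field, and backprop passes through
# the solve.
n = 64
net = torch.nn.Sequential(torch.nn.Linear(n, n), torch.nn.Tanh(),
                          torch.nn.Linear(n, n)).double()
layer = PDELayer(n)
x = torch.randn(n, dtype=torch.float64)
u_obs = torch.randn(n, dtype=torch.float64)
loss = torch.mean((layer(net(x)) - u_obs) ** 2)
loss.backward()  # gradients reach net's weights through the PDE solve
```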
A full list of publications is available on Google Scholar.
Group Information
I am very fortunate to advise the following PhD students and postdocs:
Sanjeev Raja
Nithin Chalapathi
Rasmus Høegh
Toby Kreiman
Eric Qu
Yue Jian
Yiheng Du
Undergraduate students:
Ishan Amin
Divyam Goel
Ryan Liu
Former members:
Daniel Rothchild (UC Berkeley EECS PhD) → Research Scientist at Prescient Design (Genentech)
Danny Reidenbach (UC Berkeley MS CS) → Research Scientist at NVIDIA
Dami Fasina (visiting Applied Math PhD student, Yale University)
Nick Swenson (UC Berkeley EECS MEng) → Software engineer at Google
Joining the group:
- Incoming/current UC Berkeley PhD students and prospective postdoctoral researchers: please email me directly with your research interests and CV.
- Prospective PhD students: please apply directly to a UC Berkeley PhD program. If you apply to Chemical Engineering or EECS, you can mention my name as a faculty member of interest in your application. For EECS applicants, please choose CS: AI-SCIENCE as your primary area.
- Undergraduate students: please fill out this application form.