Assistant Professor, UC Berkeley

Chemical Engineering and EECS

I am also a member of Berkeley AI Research (BAIR), and part of the AI+Science group in EECS and the theory group in Chemical Engineering.

Contact info: aditik1 at berkeley dot edu


Research interests

I am interested in developing machine learning methods driven by the distinct challenges and opportunities of the natural sciences, with a particular focus on physics-inspired machine learning. Areas of exploration include: learning strategies that incorporate physical inductive biases (such as symmetries and conservation laws) into ML models to improve generalization on scientific problems; the advantages ML can bring to classical physics-based numerical solvers (for example, through end-to-end differentiable frameworks and implicit layers); and better learning strategies for handling distribution shifts in the physical sciences. These methods are informed by and grounded in applications to atomistic and continuum problems, including fluid mechanics, molecular dynamics, materials design, and related areas. This work also interfaces with other fields, including numerical analysis, dynamical systems theory, quantum mechanics, computational geometry, optimization, and category theory.

Some examples of recent work include:

  • improving the stability of neural network interatomic potentials for molecular dynamics via a training procedure enabled by differentiable Boltzmann estimators (arXiv; 2024),

  • neural networks + spectral methods for solving PDEs: using orthogonal bases to learn transformations between spectral coefficients in a neural operator setting; leveraging Parseval’s identity to create a spectral loss function that allows training fully in spectral space (ICLR; 2024),

  • PDE-constrained optimization as a layer in a neural network (ICLR; 2023), and a scaled-up version of this approach with improved computational efficiency that represents the PDE-constrained layer computations via a mixture-of-experts (ICLR; 2024),

  • improving the efficiency of E(3)-equivariant operations in neural networks with a new tensor product formulation (ICLR, spotlight; 2024),

  • generative modeling for molecular conformers, incorporating a new coarse-graining procedure into training (BAIR Blog Post, arXiv:2306.14852; 2023).

A full list of publications is available on Google Scholar.


Group Information

I am very fortunate to advise the following PhD students and postdocs:

Daniel Rothchild
Sanjeev Raja
Nithin Chalapathi
Rasmus Høegh
Toby Kreiman
Eric Qu
Yue Jian

Undergraduate students:

Yiheng Du
Ishan Amin
Divyam Goel
Tianlang Chen

Former members:

Danny Reidenbach (UC Berkeley MS CS) → Research Scientist at NVIDIA
Dami Fasina (visiting Applied Math PhD student, Yale University)
Nick Swenson (UC Berkeley EECS MEng) → Software engineer at Google

Joining the group:

  • Incoming/current UC Berkeley PhD students and prospective postdoctoral researchers: please email me directly with your research interests and CV.
  • Prospective PhD students: please apply directly to a UC Berkeley PhD program. If you apply to Chemical Engineering or EECS, you can mention my name as a faculty member of interest in your application. For EECS applicants, please choose CS: AI-SCIENCE as your primary area.