Shashank Sule

I’m Shashank and I am a sixth-year PhD student in the Applied Mathematics, Applied Statistics and Scientific Computation program at the University of Maryland, College Park. I am jointly advised by Dr. Wojciech Czaja and Dr. Maria Cameron. I use tools from applied analysis to design and analyse algorithms for manifold learning, deep learning, and explainability in molecular dynamics and medical imaging applications. Recently I’ve been interested in equivariant graph neural networks, neural collapse, and bilevel optimization. Before Maryland, I graduated from Amherst College with a degree in mathematics.

Find my latest CV, find me on LinkedIn, or email me at ssule25[at]umd[dot]edu.

Selected publications

  1. Learning collective variables that preserve transition rates (2025). Shashank Sule, Arnav Mehta, Maria Cameron. In review at SIAM Multiscale Modeling and Simulation. arXiv.
  2. Sharp estimates for target measure diffusion maps and applications to the committor problem (2025). Shashank Sule, Luke Evans, Maria Cameron. Applied and Computational Harmonic Analysis, Volume 79, 101803, ISSN 1063-5203.
  3. On the limits of neural network explainability via descrambling (2025). Shashank Sule, Richard G. Spencer, Wojciech Czaja. Applied and Computational Harmonic Analysis, Volume 79, 101793, ISSN 1063-5203.

Recent posts

More papers!

posted on 21 Feb 2026

They say journal acceptances are like buses1: for a long while you don’t see any, and then two come along at once. To be more specific, this week two of my papers were accepted to the journals NMR in Biomedicine (NMRB) and SIAM Multiscale Modeling and Simulation (SIAM MMS), respectively. These papers are about:

  1. Input Layer Regularization (NMRB): How much information do classical statistical estimators hold about the underlying random variable, and can a neural network extract this information? We study this problem in the context of Myelin Water Imaging and empirically demonstrate that sometimes classical statistical estimators (like Generalized Cross Validation) can work better than copious amounts of deep learning. Read the paper, jointly authored with Richard Spencer’s group at the NIH, here on arXiv.

  2. Collective variables that reproduce rates (SIAM MMS): A case study paper on the butane molecule, where we turn Legoll and Lelievre’s quantitative coarse-graining theory into an algorithm for learning collective variables that preserve the dynamics of molecular systems. The paper also contains independent insights on group-invariant machine learning and manifold learning; a schematic of the underlying coarse-grained dynamics is sketched after this list. Read the paper, jointly authored with Arnav Mehta and Maria Cameron, here.
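Very roughly, the coarse-graining setup behind the SIAM MMS paper, written here only for a scalar collective variable and omitting the paper's precise rate-preservation criterion and learning algorithm, is the Legoll and Lelievre effective dynamics:

```latex
% Schematic only: full overdamped Langevin dynamics, a scalar collective
% variable xi, and the Legoll--Lelievre effective dynamics it induces.
\begin{align*}
  dX_t &= -\nabla V(X_t)\,dt + \sqrt{2\beta^{-1}}\,dW_t,
         \qquad X_t \in \mathbb{R}^{n},\\
  \xi  &\colon \mathbb{R}^{n} \to \mathbb{R}
         \quad \text{(the collective variable)},\\
  dZ_t &= b(Z_t)\,dt + \sqrt{2\beta^{-1}}\,\sigma(Z_t)\,dB_t,\\
  b(z) &= \mathbb{E}\!\left[\left(-\nabla V \cdot \nabla\xi
          + \beta^{-1}\Delta\xi\right)(X)\;\middle|\;\xi(X)=z\right],\\
  \sigma^{2}(z) &= \mathbb{E}\!\left[\,|\nabla\xi(X)|^{2}\;\middle|\;\xi(X)=z\right].
\end{align*}
% The conditional expectations are taken with respect to the Gibbs measure
% proportional to exp(-beta V). The aim is to choose xi so that transition
% rates between metastable sets of Z_t match those of X_t.
```

Roughly speaking, one then searches over a parametric family of maps xi (respecting the molecule's symmetries) so that the effective dynamics reproduces the transition rates of the full system; the details are in the paper.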

  1. See here

Collective variable discovery

posted on 5 Jul 2025

Last month marked the culmination of two major projects on collective variable discovery, a fundamental interdisciplinary problem in drug discovery, computational statistical physics, and stochastic processes. From a probabilistic perspective, the problem asks: how can we map a stochastic process to low dimensions and still preserve its statistics? In two case study-style papers on the butane molecule and Lennard-Jones clusters, we provide some answers by appealing to quantitative coarse-graining theory and proposing algorithms that use some of my favourite tools from geometric data science.
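For readers who haven't met them, one of those tools is diffusion maps. Below is a toy NumPy sketch of the vanilla Coifman and Lafon construction, not the target-measure variant used in our work; the function name and the bandwidth eps are illustrative placeholders, not code from the papers.

```python
import numpy as np

def diffusion_map(X, eps, n_evecs=2):
    """Toy diffusion-map embedding of points X (n_samples x n_features).

    Vanilla Coifman--Lafon construction with alpha = 1 density normalization;
    eps is the Gaussian kernel bandwidth.
    """
    # Pairwise squared distances and Gaussian kernel
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / eps)

    # alpha = 1 normalization: divide out the estimated sampling density
    q = K.sum(axis=1)
    K_tilde = K / np.outer(q, q)

    # Row-normalize to obtain a Markov transition matrix
    P = K_tilde / K_tilde.sum(axis=1, keepdims=True)

    # Leading nontrivial eigenvectors give the embedding coordinates
    # (the top eigenvector is constant, so it is skipped).
    evals, evecs = np.linalg.eig(P)
    order = np.argsort(-evals.real)
    evals, evecs = evals.real[order], evecs.real[:, order]
    return evecs[:, 1:n_evecs + 1] * evals[1:n_evecs + 1]

if __name__ == "__main__":
    # Noisy circle: the two-dimensional embedding should recover the loop.
    rng = np.random.default_rng(0)
    theta = rng.uniform(0, 2 * np.pi, 300)
    X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(300, 2))
    print(diffusion_map(X, eps=0.5).shape)  # (300, 2)
```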

To access older posts, click here