I’m a Ph.D. student at the Paul G. Allen School of Computer Science and Engineering at the University of Washington, advised by Prof. Sewoong Oh. I am broadly interested in machine learning and optimization, particularly in characterizing the fundamental quantities, phenomena, and general laws that arise at scale.
Currently, I am trying to understand the scaling limits of standard Euclidean first-order stochastic optimization algorithms. Specifically, I study optimization processes over functions of large, dense, unlabeled weighted graphs (e.g., deep neural networks) that are invariant under vertex relabeling (i.e., permuting neurons within a layer). As the scale of the problem increases, these processes converge to a well-defined (deterministic) gradient flow on the space of graphons. This is an attempt to generalize Wasserstein calculus to higher-order exchangeable structures.
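The vertex-relabeling invariance mentioned above is easy to see concretely: permuting the hidden neurons of a network (together with the corresponding rows and columns of its weight matrices) leaves the function it computes unchanged. A minimal numpy sketch, with illustrative names and a randomly chosen permutation:

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny 2-layer MLP: input dim 4, hidden dim 8, output dim 3.
W1 = rng.normal(size=(8, 4))   # first-layer weights
b1 = rng.normal(size=8)        # first-layer bias
W2 = rng.normal(size=(3, 8))   # second-layer weights
relu = lambda z: np.maximum(z, 0.0)

def mlp(W1, b1, W2, x):
    return W2 @ relu(W1 @ x + b1)

# Relabel the hidden neurons by a random permutation: permute the
# rows of W1 and entries of b1, and the columns of W2 to match.
perm = rng.permutation(8)
W1p, b1p, W2p = W1[perm], b1[perm], W2[:, perm]

x = rng.normal(size=4)
y_original = mlp(W1, b1, W2, x)
y_permuted = mlp(W1p, b1p, W2p, x)

# The two networks compute exactly the same function.
assert np.allclose(y_original, y_permuted)
```

Because the loss inherits this symmetry, the objective is really a function on the quotient space of weight configurations modulo neuron relabelings, which is what makes graphon-type limit objects natural as the width grows.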
Prior to joining graduate school, I spent two wonderful years as a Research Fellow at Microsoft Research Lab - India, where I was fortunate to work with Dr. Praneeth Netrapalli and Dr. Prateek Jain on provable non-convex optimization algorithms for machine learning and on understanding several phenomena in deep learning. Before the fellowship, I attended IIT Guwahati, where I obtained my B.Tech. in Mathematics and Computing in 2017.
In my spare time, I enjoy learning Mathematical Analysis, Geometry, Theoretical Physics, Modern Finance, and Economics. More broadly, I consider myself a student of mathematics, seeking its artistic elegance across scientific subjects.
Here is my Curriculum Vitae. Contact me at