I’m a Ph.D. student at the Paul G. Allen School of Computer Science and Engineering at the University of Washington, advised by Prof. Sewoong Oh. I am broadly interested in machine learning and optimization, particularly in characterizing the fundamental quantities, phenomena, and general laws that arise with scale.
Currently, I am trying to understand the scaling limits of standard Euclidean first-order stochastic optimization algorithms. Specifically, I study optimization processes over functions of large, dense, unlabeled weighted graphs (e.g., deep neural networks), which are invariant under vertex relabeling (i.e., permuting neurons within a layer). We show that these processes converge to well-defined gradient flows on the space of graphons as the scale of the problem increases. This is an attempt to generalize Wasserstein calculus to higher-order exchangeable structures. See the project page for more details!
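The vertex-relabeling invariance mentioned above can be seen concretely in a small example. The sketch below (illustrative only; the network, sizes, and names are my own, not from the text) checks that permuting the hidden neurons of a two-layer ReLU network, i.e., permuting the rows of the first layer and the matching columns of the second, leaves the computed function unchanged:

```python
import numpy as np

# Illustrative sketch: a two-layer MLP is invariant under relabeling
# its hidden neurons. Applying the same permutation to the rows of
# (W1, b1) and the columns of W2 leaves the function unchanged.
rng = np.random.default_rng(0)
d, h = 4, 8                                  # input and hidden widths (arbitrary)
W1, b1 = rng.normal(size=(h, d)), rng.normal(size=h)
W2 = rng.normal(size=(1, h))

def mlp(x, W1, b1, W2):
    # ReLU network: x -> W2 * relu(W1 x + b1)
    return W2 @ np.maximum(W1 @ x + b1, 0.0)

x = rng.normal(size=d)
perm = rng.permutation(h)                    # relabel the hidden neurons
out_orig = mlp(x, W1, b1, W2)
out_perm = mlp(x, W1[perm], b1[perm], W2[:, perm])
assert np.allclose(out_orig, out_perm)       # identical outputs
```

Because the loss only sees the function the network computes, optimization dynamics inherit this symmetry, which is what makes the graphon (exchangeable-structure) viewpoint natural.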
Before joining graduate school, I spent two wonderful years as a Research Fellow at Microsoft Research Lab - India. I was fortunate to work with Dr. Praneeth Netrapalli and Dr. Prateek Jain on provable non-convex optimization algorithms for machine learning and on understanding several phenomena in deep learning. This experience sparked my current research interests and pursuits. Before the fellowship, I attended IIT Guwahati, where I obtained my B.Tech. in Mathematics and Computing in 2017.
In my spare time, I like learning mathematical analysis, geometry, theoretical physics, modern finance, economics, and philosophy. I view myself as a disciple of mathematics, exploring its divine elegance across a myriad of scientific realms and beyond.
Here is my Curriculum Vitae. Contact me at
“तत् त्वम् असि।” (“That thou art.”)