- Gradient Flows on Graphons: Existence, Convergence, Continuity Equations - Journal of Theoretical Probability; also presented at the Optimal Transport and Machine Learning (OTML) workshop, Neural Information Processing Systems (NeurIPS), 2021 [paper] [arXiv] [bib]
Curriculum Vitae
Research Interests
I am broadly interested in machine learning and optimization. Throughout my research journey, I have been particularly interested in characterizing fundamental quantities, phenomena, and general laws that arise due to scale. Having switched to quantitative research, I like to study market behaviors, apply my research to craft systematic forecasts, and build risk-aware portfolios. Independently, I am also interested in exploring options/volatility trading in practice.
Education
- Ph.D. at Paul G. Allen School of Computer Science & Engineering, 2019 - 2024
  - University of Washington
  - Advisor: Prof. Sewoong Oh
  - Ph.D. Thesis: Scaling Limits of Algorithms on Large Matrices
- M.S. in Computer Science and Engineering, 2022
  - Paul G. Allen School of Computer Science and Engineering, University of Washington
  - Supervisor: Prof. Sewoong Oh
- B.Tech. in Mathematics and Computing, 2013 - 2017
  - Indian Institute of Technology Guwahati
Work Experience
- September 2024 - Present: Quantitative Analyst
- June 2023 - September 2023: Quantitative Analyst Ph.D. Intern
  - The D. E. Shaw Group
  - Group: Systematic Equities
- September 2019 - August 2024: Graduate Research Assistant
  - Machine Learning Lab, University of Washington
  - Advisor: Prof. Sewoong Oh
- July 2017 - July 2019: Research Fellow
- Summer 2016: Research Intern
  - Microsoft Research Lab - India
  - Supervisor: Dr. Sreangsu Acharyya
  - Project: Clustered Monotone Transforms for Rating Factorization
- Summer 2015: Summer Research Intern
  - CAFRAL (Centre for Advanced Financial Research and Learning), Reserve Bank of India
  - Supervisor: Dr. N. R. Prabhala
  - Project: Economic Policy Uncertainty Index - Intern report
Selected Publications
Complete list on Google Scholar or dblp.
- Non-Gaussianity of Stochastic Gradient Noise - Science meets Engineering of Deep Learning (SEDL) workshop, Neural Information Processing Systems (NeurIPS), 2019 [arXiv] [bib]
- Support Recovery for Orthogonal Matching Pursuit: Upper and Lower Bounds - Neural Information Processing Systems (NeurIPS), 2018 [paper] [bib] [video]
Research Projects
- Scaling Limits of Algorithms on Large Matrices - University of Washington
- Scaling laws of optimization algorithms for Deep Learning - the Graphon perspective - University of Washington
- Robust Mixed Linear Regression using heterogeneous batches - University of Washington
- Connections between Stochasticity of SGD and Generalizability - Microsoft Research Lab - India
- Universality Patterns in the Training of Neural Networks - Microsoft Research Lab - India
- Sparse Regression and Support Recovery bounds for Orthogonal Matching Pursuit - Microsoft Research Lab - India
- Some Approaches of Building Recommendation Systems - Department of Mathematics, IIT Guwahati
Professional Services
- Reviewer for NeurIPS '19, NeurIPS '20, ICML '20, and JMLR.
Teaching
- Teaching assistant for CSE 446/546 Spring 2021.
- Teaching assistant for CSE 446 Spring 2024.