Support Recovery for Orthogonal Matching Pursuit: Upper and Lower bounds
Published at Neural Information Processing Systems (NeurIPS), 2018
This paper studies the problem of sparse regression, where the goal is to learn a sparse vector that best optimizes a given objective function. Under the assumption that the objective function satisfies restricted strong convexity (RSC), we analyze Orthogonal Matching Pursuit (OMP), a greedy algorithm that is used heavily in applications, and obtain a support recovery result as well as a tight generalization error bound for OMP. Furthermore, we obtain lower bounds for OMP, showing that both our support recovery and generalization error results are tight up to logarithmic factors. To the best of our knowledge, these are the first matching upper and lower bounds (up to logarithmic factors) on support recovery and generalization error for any sparse regression algorithm under the RSC assumption.
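For concreteness, below is a minimal sketch of OMP instantiated for the familiar least-squares objective (the paper's analysis covers general objectives satisfying RSC). The function name `omp` and the NumPy-based implementation are illustrative, not the authors' code: the algorithm greedily adds the coordinate most correlated with the current residual and then refits by least squares on the selected support.

```python
import numpy as np

def omp(X, y, s):
    """Illustrative Orthogonal Matching Pursuit for least-squares sparse
    regression: greedily grow the support, refitting on it at each step."""
    n, d = X.shape
    support = []
    beta = np.zeros(d)
    residual = y.copy()
    for _ in range(s):
        # Select the coordinate whose column is most correlated with the residual.
        correlations = np.abs(X.T @ residual)
        correlations[support] = -np.inf  # never re-select chosen coordinates
        j = int(np.argmax(correlations))
        support.append(j)
        # Refit by least squares restricted to the current support.
        coef, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        beta = np.zeros(d)
        beta[support] = coef
        residual = y - X @ beta
    return beta, support
```

Under the RSC condition analyzed in the paper, this greedy support growth is what allows OMP to recover the true support (up to the stated logarithmic factors) rather than merely achieving small objective value.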
The paper has been accepted for a Spotlight presentation at the conference (168 of 4856 submissions, ≈ 3.5% spotlight acceptance rate).
Please find the resources below:
- Proceedings and paper.
- NeurIPS poster.
- Spotlight presentation slides.
- Spotlight presentation video (from 1:36:20).