
On (1 + ϵ)-Approximate Block Sparse Recovery

Summary

Learning approximately block sparse vectors from a small number of linear measurements is a standard task in the sparse recovery/compressed sensing literature. Schemes achieving a constant-factor approximation have long been known, e.g. via the model-based RIP. We give a new scheme achieving a (1+\epsilon) approximation, which runs in time near-linear in the length of the vector and is likely to be optimal up to constant factors. As an intriguing side result, we obtain the simplest measurement-optimal \ell_2/\ell_2 sparse recovery scheme recorded in the literature. The main component of our algorithm is a subtle variant of the classic CountSketch data structure, in which the random signs are replaced by Gaussians and the number of repetitions (rows) is tuned to be smaller than usual.
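To make the main component concrete, below is a minimal Python sketch of a CountSketch-style linear sketch in which the usual random +/-1 signs are replaced by Gaussian coefficients. This is an illustrative toy, not the paper's exact construction or estimator: the bucket count B, repetition count R, and the median-based per-coordinate estimator are assumptions chosen for simplicity.

```python
# A hedged sketch of CountSketch with Gaussian coefficients instead of random signs.
# B (buckets) and R (repetitions) are illustrative knobs; the paper's point is that
# R can be tuned smaller than in the classic CountSketch setting.
import numpy as np

rng = np.random.default_rng(0)

def build_sketch(n, B, R):
    """Draw hash buckets h[r, i] in [0, B) and Gaussian coefficients g[r, i]."""
    h = rng.integers(0, B, size=(R, n))
    g = rng.standard_normal(size=(R, n))
    return h, g

def measure(x, h, g, B):
    """Apply the linear map: one length-B measurement vector per repetition."""
    R, n = h.shape
    y = np.zeros((R, B))
    for r in range(R):
        np.add.at(y[r], h[r], g[r] * x)
    return y

def estimate(y, h, g):
    """A simple (hypothetical) estimator: for each coordinate i, take the median
    over repetitions of y[r, h_r(i)] / g[r, i]."""
    R, n = h.shape
    ests = np.array([y[r, h[r]] / g[r] for r in range(R)])
    return np.median(ests, axis=0)

# Toy usage: a vector with one block of large coordinates plus small noise.
n, B, R = 1000, 50, 5
x = 0.1 * rng.standard_normal(n)
x[100:104] += 10.0                      # one "block" of large coordinates
h, g = build_sketch(n, B, R)
y = measure(x, h, g, B)
xhat = estimate(y, h, g)
print(np.argsort(-np.abs(xhat))[:4])    # should report indices near 100..103
```

The total number of measurements is R * B, so the interesting trade-off is how small R and B can be made while still supporting a (1+\epsilon)-accurate estimate of the dominant blocks; the paper's contribution lies in that analysis rather than in the routine sketch-and-estimate loop above.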

Conference Paper

IEEE International Symposium on Information Theory (ISIT)

Date published

2022-06-26

Date last modified

2024-11-29