Learning approximately block sparse vectors using a small number of linear measurements is a standard task in the sparse recovery/compressed sensing literature. Schemes achieving a constant-factor approximation have long been known, e.g. using model-based RIP. We give a new scheme achieving a (1+\epsilon) approximation, which runs in near-linear time in the length of the vector and is likely to be optimal up to constant factors. As an intriguing side result, we obtain the simplest measurement-optimal \ell_2/\ell_2 sparse recovery scheme recorded in the literature. The main component of our algorithm is a subtle variant of the classic CountSketch data structure, where the random signs are substituted by Gaussians and the number of repetitions (rows) is tuned to be smaller than usual.
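The CountSketch variant mentioned above can be sketched in a few lines. The following is a minimal illustration, not the paper's exact construction: each repetition hashes coordinates into buckets and multiplies by an i.i.d. standard Gaussian instead of a random sign, and a coordinate is estimated by the median over repetitions. All parameter choices (`num_buckets`, `num_reps`) here are illustrative, not the tuned values from the paper.

```python
import numpy as np

def build_sketch(x, num_buckets, num_reps, rng):
    """Sketch x with num_reps independent (hash, Gaussian) pairs.

    Classic CountSketch multiplies each coordinate by a random +-1 sign;
    in the variant described in the abstract, the sign is replaced by a
    standard Gaussian multiplier.
    """
    n = len(x)
    hashes = rng.integers(0, num_buckets, size=(num_reps, n))
    gauss = rng.standard_normal((num_reps, n))
    buckets = np.zeros((num_reps, num_buckets))
    for r in range(num_reps):
        # Each bucket accumulates g(i) * x_i over the coordinates hashed to it.
        np.add.at(buckets[r], hashes[r], gauss[r] * x)
    return hashes, gauss, buckets

def estimate(i, hashes, gauss, buckets):
    """Estimate x_i as the median over repetitions of b[h(i)] / g(i).

    When coordinate i is isolated in its bucket, b[h(i)] / g(i) equals
    x_i exactly; the median makes the estimate robust to collisions.
    """
    num_reps = buckets.shape[0]
    vals = buckets[np.arange(num_reps), hashes[:, i]] / gauss[:, i]
    return np.median(vals)

# Illustrative usage: recover one heavy coordinate amid small noise.
rng = np.random.default_rng(0)
x = np.zeros(1000)
x[5] = 10.0
x[rng.integers(0, 1000, size=10)] += 0.1
est = estimate(5, *build_sketch(x, num_buckets=100, num_reps=9, rng=rng))
```

With these illustrative parameters, most repetitions isolate the heavy coordinate, so the median estimate lands very close to its true value.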
Preferred Citation
Baris Can Esmer and Vasileios Nakos. On (1 + ϵ)-Approximate Block Sparse Recovery. In: IEEE International Symposium on Information Theory (ISIT). 2022.
Primary Research Area
Algorithmic Foundations and Cryptography
Name of Conference
IEEE International Symposium on Information Theory (ISIT)
Legacy Posted Date
2022-05-05
Open Access Type
Unknown
BibTeX
@inproceedings{cispa_all_3648,
  title     = "{On (1 + ϵ)-Approximate Block Sparse Recovery}",
  author    = "Esmer, Baris Can and Nakos, Vasileios",
  booktitle = "{IEEE International Symposium on Information Theory (ISIT)}",
  year      = "2022",
}