Friday, March 2, 2012

Review of the size of Measurement Matrix in Compressed Sensing

* This article is only a review for my personal use. It may contain mistakes, so please do not rely on it.
* If you notice any mistakes in this article, please let me know. Thanks a lot.

In the following, $M$ is the number of measurements needed, $N$ is the length of the signal, $K$ is the sparsity, $\Phi$ is the measurement matrix, $\Psi$ is the sparse basis, and $C$ is a constant.
$Y = AX = \Phi\Psi X$
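
As a concrete toy example (my own, not taken from the papers below), here is this model in NumPy, assuming a Gaussian $\Phi$ and a DCT $\Psi$:

import numpy as np
from scipy.fft import dct

N, M, K = 256, 64, 8                            # signal length, measurements, sparsity
rng = np.random.default_rng(0)

Psi = dct(np.eye(N), norm='ortho')              # orthonormal DCT basis (N x N)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)  # Gaussian measurement matrix (M x N)

X = np.zeros(N)                                 # K-sparse coefficient vector
X[rng.choice(N, K, replace=False)] = rng.standard_normal(K)

A = Phi @ Psi                                   # effective sensing matrix
Y = A @ X                                       # M measurements: Y = A X = Phi Psi X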

The requirements on $M$:
[1] Sparsity and Incoherence in compressive sampling
$M \geq C \cdot \mu^2(\Phi, \Psi) \cdot K \log N$
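
As a sanity check (my own sketch, taking $C = 1$ just to get an order of magnitude; the constant in [1] is not specified here), the coherence $\mu(\Phi, \Psi) = \sqrt{N} \max_{k,j} |\langle \phi_k, \psi_j \rangle|$ and the resulting bound can be evaluated for the spike/Fourier pair, which is maximally incoherent ($\mu = 1$):

import numpy as np

def coherence(Phi, Psi):
    # mu(Phi, Psi) = sqrt(N) * max |<phi_k, psi_j>|, always in [1, sqrt(N)]
    N = Phi.shape[0]
    return np.sqrt(N) * np.max(np.abs(Phi @ Psi.conj().T))

N, K = 256, 8
Spikes = np.eye(N)                            # identity (spike) basis
Fourier = np.fft.fft(np.eye(N)) / np.sqrt(N)  # unitary DFT basis

mu = coherence(Spikes, Fourier)               # = 1: maximally incoherent pair
M_needed = mu**2 * K * np.log(N)              # with C = 1: about 44 measurements
print(mu, M_needed)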

[2] An introduction to Compressive Sampling
Form $A$ obeying the RIP, constructions i)-iv):
$M = O(K \log(N/K))$
$M \geq C \cdot K \log(N/K)$
i)-iii) see
[3] A simple proof of the restricted isometry property for random matrices
iv) see
[4] Uniform uncertainty principles for Bernoulli and sub-gaussian ensembles

Form $A$ by first finding a pair of incoherent orthobases $\Phi, \Psi$, and then extracting $M$ coordinates uniformly at random using $R$: $A = R\Phi\Psi$.
$M \geq C \cdot (\log N)^4$
$M \geq C \cdot (\log N)^5$ for a lower probability of failure
see [6] and [7] (On sparse reconstruction from Fourier and Gaussian measurements)
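
A minimal sketch of this construction (again my own illustration, using spikes and the DCT as the incoherent pair, and an arbitrary $M$, since the $(\log N)^4$ bound only becomes informative for much larger $N$):

import numpy as np
from scipy.fft import dct

N, M = 256, 64                          # the bound above would put M on the order
                                        # of (log N)^4; M = 64 is just for the demo
rng = np.random.default_rng(0)

Phi = np.eye(N)                         # spike basis
Psi = dct(np.eye(N), norm='ortho')      # DCT basis, incoherent with the spikes

rows = rng.choice(N, size=M, replace=False)   # R: pick M coordinates uniformly at random
A = (Phi @ Psi)[rows, :]                      # M x N sensing matrix A = R Phi Psi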

[6] Near-optimal signal recovery from random projections and universal encoding strategies
[8] Compressed Sensing, D. L. Donoho
[9] Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
[10] Neighborliness of randomly projected simplices in high dimensions
[11] High-dimensional centrally symmetric polytopes with neighborliness proportional to dimension
$M = O(K \log(N))$
$M \geq C \cdot K \cdot \log N$

[12] Compressive Sensing, R. G. Baraniuk
$M \geq C \cdot K \log(N/K)$
It cites this result from [8] and [9]. Are they the same?

To summarize, there are four expressions:
1)
$M = O(K \log(N))$
$M \geq C \cdot K \cdot \log N$
2)
$M \geq C \cdot K \log(N/K)$
3)
$M \geq C \cdot (\log N)^4$
$M \geq C \cdot (\log N)^5$
4)
$M \geq C \cdot \mu^2(\Phi, \Psi) \cdot K \log N$
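
Plugging in concrete numbers (say $N = 10^6$, $K = 100$, and, purely for comparison, $C = 1$ and $\mu = 1$, which are my own simplifications) shows how differently these expressions scale:

import numpy as np

N, K = 10**6, 100
print("1) K log N      :", K * np.log(N))         # ~ 1.4e3
print("2) K log(N/K)   :", K * np.log(N / K))     # ~ 9.2e2
print("3) (log N)^4    :", np.log(N) ** 4)        # ~ 3.6e4
print("   (log N)^5    :", np.log(N) ** 5)        # ~ 5.0e5
print("4) mu^2 K log N :", 1**2 * K * np.log(N))  # equals 1) when mu = 1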

1) and 2) are quite similar to each other; 1) is for the noisy setting and 2) for the noiseless setting.
4) is quite similar to 1), except for the factor $\mu^2(\Phi, \Psi)$, which measures the coherence between the two bases $\Phi$ and $\Psi$ (smaller means more incoherent).
