Journal of Computational Information Systems 10: 16 (2014) 7077–7084
Available at http://www.Jofcis.com
Backtracking Iterative Thresholding Matching Pursuit
Algorithm ⋆
Guiling SUN ∗,
Chen YU, Yan LIN, Wei DANG
College of Information Technical Science, Nankai University, Tianjin 300071, China
Abstract
Based on the theory of Compressive Sensing (CS), we propose the BITMP algorithm, which combines the backtracking iterative thresholding algorithm with the matching pursuit algorithm, adapts to the signal sparsity automatically, and is strongly robust to noisy signals. MATLAB simulation experiments show that BITMP has many advantages in reconstruction probability and reconstruction time compared with iterative thresholding type and matching pursuit type algorithms, which will promote the practical application of compressive sensing.
Keywords: Compressive Sensing; Sparsity Adaptive; Matching Pursuit; Iterative Thresholding;
Reconstruction Algorithm
1 Introduction
Candes proved in 2006 that an original signal can be reconstructed exactly from a subset of its Fourier transform coefficients [1]; the theory of CS was then proposed by Candes and Donoho [2]. The core of CS is the combination of compression and sampling. After collecting a linear projection of the signal, we can reconstruct the original signal from the measurements by a suitable algorithm. CS breaks through the limit of the Shannon sampling theorem and makes it possible to collect high-resolution signals. CS shows huge application potential in many areas, such as medical technology [3] and remote sensing [4].
CS theory mainly comprises sparse signal representation, encoded measurement, and reconstruction algorithms. The reconstruction algorithm has a significant influence on recovering the signal accurately. Iterative greedy algorithms are widely applied owing to their fast speed and simple structure; representative algorithms are MP [5] and OMP [6]. The improved method ROMP [7] was then proposed; it performs well for all sparse signals and measurement matrices satisfying the Restricted
⋆ Project supported by the National Natural Science Foundation of China (No. 61171140), the Specialized Research Fund for the Doctoral Program of Higher Education (No. 20130031110032), and the Tianjin Key Technology Program of the Ministry of Science and Technology (No. 14ZCZDNC00014).
∗ Corresponding author.
Email address: [email protected] (Guiling SUN).
1553–9105 / Copyright © 2014 Binary Information Press
DOI: 10.12733/jcis11465
August 15, 2014
Isometry Property. SP [8] and CoSaMP [9] introduced the idea of backtracking. Focusing on the situation where the signal sparsity is unknown, SAMP [10] was proposed to solve the problem.
Considering that the signal sparsity is uncertain in most situations, this paper proposes the Backtracking Iterative Thresholding Matching Pursuit (BITMP) algorithm. It brings a backtracking process into the iterative thresholding algorithm and combines it with the matching pursuit method; it recovers the signal accurately when the sparsity is unknown and improves computational efficiency and stability.
2 Compressive Sensing and Reconstruction Algorithm
Let x be an N-dimensional signal. If there is an orthogonal transform Ψ ∈ RN×N that makes x a K-sparse signal, an M-dimensional vector y is generated by the linear measurement Θ, where K, M and N satisfy K ≤ M ≪ N. The dimension of y is much smaller than that of x, so the goal of sampling the signal at a dimension well below the original one is realized. CS transforms the matter of signal recovery into solving a constrained optimization problem:

min ∥x∥l0 , s.t. Φx = y (1)

The l0 norm is the number of non-zero elements in x, and Φ = ΘΨ. The data at the K sparse positions contain all the information of x, and the dimension of the measurement vector y is able to ensure that all these data are preserved. The purpose of CS, obtaining information from the signal directly, is thus achieved. Solving this problem directly is NP-hard; CS theory opens a new path to overcome the difficulty.
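As a concrete illustration of the measurement model behind problem (1), the following sketch generates a K-sparse signal and its M-dimensional measurement. This is a minimal example using NumPy; the sizes N, M, K and the Gaussian matrix are illustrative choices rather than the paper's experimental setup, and the sparsifying basis Ψ is taken as the identity so that Φ = Θ.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 128, 8            # K <= M << N, as the theory requires

# K-sparse signal: K non-zero entries at random positions
x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)

# Normalized random Gaussian measurement matrix Phi
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# M-dimensional measurement vector: far fewer samples than the signal length
y = Phi @ x
```

Recovering x from y and Φ alone is exactly the l0 minimization problem (1).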
Signal recovery algorithms for CS fall into roughly three types. The first is iterative greedy algorithms, which pick a locally optimal solution in each iteration to approximate the original signal step by step. The second is convex relaxation methods, which find an optimal approximation of the original signal by transforming the non-convex problem into a convex one. The minimum l0 norm method and matching pursuit do not solve the l0 norm problem directly; aiming at this problem, IHT [11] and NIHT [12] were put forward. These algorithms converge to sub-optimal solutions due to the non-convexity of the l0 norm problem, so they are sensitive to the initial value. The third is combinatorial methods, which require sampling that supports reconstructing the signal by group testing.
3 Backtracking Iterative Thresholding Matching Pursuit Algorithm

3.1 Backtracking iterative thresholding method
IHT aims at solving the optimization problem under the constraint below:

min ∥y − Φx∥₂² , s.t. ∥x∥l0 ≤ K (2)

K is the sparsity of the signal. It has been proved [11] that IHT converges to a local minimum of the optimization problem. It tackles the l0 norm problem, which is NP-hard, considers the
number of non-zero elements in the sparse signal, and finds the solution that best approximates the original signal. The recurrence formula of IHT is:

xm+1 = HK(xm + ΦT(y − Φxm)) (3)

HK(ξ) is a non-linear operator that retains the K entries of the vector ξ with the largest absolute values and sets the others to zero. In each iteration, IHT puts K vectors into the candidate list by projecting the signal residual onto the K most matchable vectors in the atom dictionary, which is made up of the columns of Φ, and then turns to the next iteration. This may lead to a result in which an element set to zero in an earlier iteration is reserved again in a later iteration because its residual projection is among the K largest absolute values. It means that some elements may be selected and discarded again and again, so IHT may need excessive iterations to reconstruct the signal. This increases the computational complexity and reconstruction time, and also harms the accuracy of the algorithm. Inspired by CoSaMP, the idea of backtracking is added to IHT: a step that re-estimates candidate reliability is put into the process of removing and adding vectors in each iteration. In each iteration, the nonlinear atom principle is retained to remove vectors from the candidate list. The backtracking step combines the result of the last iteration with the new vector support from the nonlinear operator HK(ξ), then re-estimates the result by calculating the pseudo-inverse and applying HK(ξ) again. Although this pattern has one more step than the original algorithm, it avoids the repeated selection of atoms and greatly improves efficiency, stability and accuracy. The algorithm idea is shown as follows:
(1) Iteration number j = 1, signal residual r1 = y, x1 = 0.
(2) Do the thresholding iteration, γj+1 = HK(xj + ΦT rj), and combine the new indexes with the indexes selected in the last iteration, Λj = supp(xj) ∪ supp(γj+1).
(3) Do the backtracking: choose the best support set index and re-estimate the signal, x̃j+1 = Φ†Λj y, xj+1 = HK(x̃j+1). Compute the residual rj+1 = y − Φxj+1. If the stop criterion is satisfied, quit the iteration; otherwise, go to (2).
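The three steps above can be sketched as follows. This is a minimal NumPy sketch under the paper's notation; the function names hard_threshold and backtracking_iht, the iteration cap and the tolerance are our illustrative choices, and the sparsity K is assumed known here.

```python
import numpy as np

def hard_threshold(v, K):
    """Nonlinear operator H_K: keep the K largest-magnitude entries of v."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-K:]
    out[idx] = v[idx]
    return out

def backtracking_iht(Phi, y, K, x0=None, max_iter=100, tol=1e-6):
    """Backtracking iterative thresholding, steps (1)-(3) above."""
    N = Phi.shape[1]
    x = np.zeros(N) if x0 is None else x0.astype(float).copy()
    r = y - Phi @ x                                   # step (1)
    for _ in range(max_iter):
        # Step (2): thresholding iteration, merge old and new supports
        gamma = hard_threshold(x + Phi.T @ r, K)
        Lam = np.union1d(np.flatnonzero(x), np.flatnonzero(gamma)).astype(int)
        # Step (3): backtracking -- least-squares re-estimate on Lam, prune to K
        x_tilde = np.zeros(N)
        x_tilde[Lam] = np.linalg.pinv(Phi[:, Lam]) @ y
        x = hard_threshold(x_tilde, K)
        r = y - Phi @ x
        if np.linalg.norm(r) < tol:                   # stop criterion
            break
    return x
```

Compared with plain IHT, the extra least-squares step on the merged support Λj is what prevents atoms from being selected and discarded repeatedly.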
3.2 BITMP algorithm
The initial value of iterative thresholding influences the performance of the algorithm. We combine the modified iterative thresholding with matching pursuit: first select the atoms that best match the signal from the over-complete atom dictionary, then calculate the needed initial vector by the least squares method. Using this initial value in the modified iterative thresholding, the algorithm performance improves obviously in accuracy, stability and computational efficiency. The traditional iterative thresholding method cannot adapt to the signal sparsity automatically. Adopting a divide-and-conquer strategy, we pursue the optimal support set under different sparsity levels and extend the support set stage by stage during the pursuit. The stage transition is decided by the residual change: if the signal residual is larger than that of the last step in this stage, the estimation of the signal is already optimal in this stage; if the required recovery accuracy has not been reached, the algorithm steps into the next stage and updates the capacity of the support set. In this way, the signal sparsity is estimated step by step, self-adaption of the signal sparsity is realized, and the original signal is recovered accurately. When to stop the computation, i.e., when the reconstruction accuracy is satisfied, is an important issue: we hope that the recovered signal approximates the original signal extremely well, while avoiding the increase in computational complexity caused by excessive calculation. The steps of BITMP are as follows:
(1) Stage number i = 1, iteration number m = 1, signal residual r1 = y, support set F = ∅, step size s, support set size I = s.
(2) Find the I vectors most correlated with the signal residual and record their indexes in Cm, Cm = max(|ΦT rm|, I), F = F ∪ Cm.
(3) Compute the estimated initial value of the signal by the least squares method, x0 = Φ†F y.
(4) Take x0 as the iteration initial value and compute the reconstruction signal x̂ using the backtracking iterative thresholding of Section 3.1. If the signal residual satisfies ∥r̃m∥2 ≥ ∥rm∥2, go to (5); otherwise, go to (6).
(5) Update the stage: I = I + s, m = m + 1, i = i + 1, go to (2).
(6) Update the signal residual: rm = r̃m, m = m + 1; if the stop criterion is satisfied, go to (7), otherwise go to (2).
(7) Output the reconstruction signal x̂.
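The steps above can be sketched as follows. This is a minimal NumPy sketch; it redefines a condensed version of the backtracking iterative thresholding of Section 3.1 as the inner solver, using the current support-set size I as the sparsity estimate, and the stage cap and tolerance are our illustrative choices, not values from the paper.

```python
import numpy as np

def hard_threshold(v, K):
    """H_K: keep the K largest-magnitude entries of v, zero the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-K:]
    out[idx] = v[idx]
    return out

def backtracking_iht(Phi, y, K, x0, max_iter=50, tol=1e-8):
    """Inner solver: backtracking iterative thresholding (Section 3.1)."""
    x = x0.copy()
    r = y - Phi @ x
    for _ in range(max_iter):
        gamma = hard_threshold(x + Phi.T @ r, K)
        Lam = np.union1d(np.flatnonzero(x), np.flatnonzero(gamma)).astype(int)
        x_tilde = np.zeros_like(x)
        x_tilde[Lam] = np.linalg.pinv(Phi[:, Lam]) @ y
        x = hard_threshold(x_tilde, K)
        r = y - Phi @ x
        if np.linalg.norm(r) < tol:
            break
    return x

def bitmp(Phi, y, s=5, max_stages=50, tol=1e-6):
    """Steps (1)-(7): stage-wise pursuit with sparsity adaptation."""
    N = Phi.shape[1]
    F = np.array([], dtype=int)          # step (1): support set
    r = y.copy()
    I = s                                # current support-set size
    x_hat = np.zeros(N)
    for _ in range(max_stages):
        # Step (2): add the I columns most correlated with the residual
        Cm = np.argsort(np.abs(Phi.T @ r))[-I:]
        F = np.union1d(F, Cm).astype(int)
        # Step (3): least-squares initial value on F
        x0 = np.zeros(N)
        x0[F] = np.linalg.pinv(Phi[:, F]) @ y
        # Step (4): refine with the inner solver
        x_new = backtracking_iht(Phi, y, K=I, x0=x0)
        r_new = y - Phi @ x_new
        if np.linalg.norm(r_new) >= np.linalg.norm(r):
            I += s                       # step (5): enlarge the support set
        else:
            r, x_hat = r_new, x_new      # step (6): accept the better estimate
        if np.linalg.norm(r) < tol:      # required accuracy reached
            break
    return x_hat                         # step (7)
```

The residual comparison in step (4) is what drives the stage transition: the support-set size I only grows when the current stage stops improving the estimate, so the sparsity need not be known in advance.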
3.3 Theory analysis
We analyze the signal recovery accuracy of BITMP. [11] has already proved the convergence of the iterative thresholding algorithm; next we prove the signal reconstruction ability of BITMP, which adds backtracking to the algorithm. [13] has proved that the measurement matrix satisfies the RIP criterion, which can be expressed as:

(1 − δ)∥x∥2 ≤ ∥Φx∥2 ≤ (1 + δ)∥x∥2 (4)

Since y = Φx, it follows that

(1 − δ)∥x∥ ≤ ∥ΦT y∥ ≤ (1 + δ)∥x∥ (5)
The residual in the m-th iteration can be expressed as:

rm = y − ΦFm Φ†Fm y = ΦΥ xΥ + ΦΓ xΓ − ΦFm Φ†Fm (ΦΥ xΥ + ΦΓ xΓ)
   = ΦΓ xΓ − ΦFm Φ†Fm (ΦΓ xΓ) = ΦΓ xΓ − ΦFm ((ΦTFm ΦFm)† ΦTFm ΦΓ xΓ) (6)

Fm is the atom index set after the m-th iteration, Υ is the atom index set which is selected in the m-th iteration and retained in the final list, and Γ is the atom index set which has been deleted in the m-th iteration. Let

xΔ = (ΦTFm ΦFm)† ΦTFm ΦΓ xΓ (7)
From (5),

∥ΦT Φ∥ ≥ 1 − δ (8)

[13] has shown that if Fm and Γ have no intersection:

∥ΦTFm ΦΓ xΓ∥ ≤ δ∥xΓ∥ (9)

Then

(1 − δ)∥xΔ∥ ≤ δ∥xΓ∥ (10)

If the atoms in Ω were not selected into the final index support set, then obviously ∥ΦTΩ rm∥ ≥ ∥ΦTΓ rm∥, and

∥ΦTΩ rm∥ = ∥ΦTΩ (ΦΓ xΓ − ΦFm xΔ)∥ ≤ ∥ΦTΩ ΦΓ xΓ∥ + ∥ΦTΩ ΦFm xΔ∥ (11)

∥ΦTΓ rm∥ = ∥ΦTΓ (ΦΓ xΓ − ΦFm xΔ)∥ ≥ ∥ΦTΓ ΦΓ xΓ∥ − ∥ΦTΓ ΦFm xΔ∥ (12)
Bounding both sides with (9) and (10) then gives

(δ − δ²)∥xΓ∥ + δ²∥xΓ∥ > (1 − δ)²∥xΓ∥ − δ²∥xΓ∥ (13)

Dividing by ∥xΓ∥, we can deduce that δ > 1 − 2δ, i.e., δ > 1/3. But δ < 1/3 according to [8], so the assumption is not true. It indicates that BITMP can reconstruct the original signal accurately although it deletes some atoms in the process of backtracking.
Fig. 1: Reconstruction signal of different iterative algorithms
Fig. 2: Reconstruction error of different iterative algorithms
Fig. 3: Signal reconstruction rate in different M of different algorithms
Table 1: Reconstruction performance of different algorithms

Algorithm   Recovery Probability   Recovery Time (s)   Recovery Error
IHT         0.58                   0.0413              3.179 × 10⁻⁶
NIHT        0.42                   0.0017              7.64 × 10⁻⁷
BITMP       0.90                   0.0030              1.178 × 10⁻¹⁵

4 Simulation and Analysis

4.1 Comparison with iterative thresholding type algorithms
BITMP is simulated in MATLAB, taking iterative thresholding type and matching pursuit type algorithms as references for comparison and analysis in various aspects. The simulation results
Fig. 4: Signal reconstruction rate in different signal sparsity of different algorithms (OMP, CoSaMP, SAMP, BITMP; x-axis: Signal Sparsity K, 0–70; y-axis: Signal Reconstruction Rate, 0–1)

Fig. 5: Reconstruction time in different signal sparsity of different algorithms (OMP, CoSaMP, SAMP, BITMP; x-axis: Signal Sparsity K, 0–70; y-axis: Reconstruction Time, 0–0.4 s)
show that BITMP has better performance than the other algorithms in reconstruction time and accuracy.
Choose an N = 256 dimensional noisy frequency-domain signal of unknown sparsity and a 128 × 256 normalized random Gaussian matrix, as Fig. 1 and Fig. 2 show, and compare the recovered signal and reconstruction error of IHT, NIHT and BITMP. The figures show that IHT cannot recover the original signal well and its error is large; NIHT can basically recover the signal. Compared with the two algorithms above, the reconstruction accuracy of BITMP reaches 10⁻¹⁶, and it has strong robustness to noisy signals.
We still use the N = 256 dimensional noisy frequency-domain signal of unknown sparsity above and compare the signal recovery performance of IHT, NIHT and BITMP. As the value of M in the measurement matrix increases gradually from 10 to 150, the reconstruction of the signal is getting
better, but the performance is bad when the value of M is small. The recovery quality of NIHT stays at a low level, while BITMP has an obvious advantage over IHT and NIHT.
Choose an N = 256 dimensional binary sparse signal with sparsity K = 3 and a 128 × 256 measurement matrix, as Table 1 shows, and compare the signal reconstruction performance of IHT, NIHT and BITMP. We can see that BITMP is superior to the other two algorithms in recovery probability and recovery error. The recovery times of BITMP and NIHT are of the same order of magnitude and obviously superior to that of IHT.
4.2 Comparison with matching pursuit type algorithms
Choose an N = 256 dimensional binary sparse signal; the dimension of the measurement matrix is 128 × 256, and the step size of SAMP and BITMP is 5. Compare the signal reconstruction results of OMP, CoSaMP, SAMP and BITMP, shown in Fig. 4. Although the recovery probabilities of all these algorithms show a decreasing tendency as the signal sparsity increases, BITMP has an obvious advantage over OMP, CoSaMP and SAMP. When the signal sparsity is larger than 65, the probability of signal recovery by the other algorithms is close to zero, but BITMP can still reconstruct the original signal.

The recovery times of these algorithms are shown in Fig. 5. The recovery time of BITMP is ahead of OMP and SAMP and equal to that of CoSaMP. It shows that BITMP does not obtain its high probability of signal recovery at the cost of computational efficiency.
5 Conclusion
After deep research on CS theory and different reconstruction algorithms, we propose the backtracking iterative thresholding matching pursuit algorithm. BITMP is based on the iterative thresholding algorithm; it adds a backtracking step that improves accuracy and efficiency, combines it with matching pursuit, realizes self-adaption of the signal sparsity, and has strong robustness to noisy signals. It overcomes the drawbacks of traditional algorithms, which cannot self-adapt to the signal sparsity and perform badly when a noisy signal is recovered. It promotes the combination of CS theory and practical application.
References

[1] Candes E J, Romberg J, Tao T. Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information [J]. IEEE Transactions on Information Theory, 2006, 52(2): 489-509.
[2] Donoho D L. Compressed sensing [J]. IEEE Transactions on Information Theory, 2006, 52(4): 1289-1306.
[3] Merlet S L, Deriche R. Continuous diffusion signal, EAP and ODF estimation via Compressive Sensing in diffusion MRI [J]. Medical Image Analysis, 2013, 17(5): 556-572.
[4] Mat Noor N R, Vladimirova T. Investigation into lossless hyperspectral image compression for satellite remote sensing [J]. International Journal of Remote Sensing, 2013, 34(14): 5072-5104.
[5] Mallat S, Zhang Z. Matching pursuits with time-frequency dictionaries [J]. IEEE Transactions on Signal Processing, 1993, 41(12): 3397-3415.
[6] Tropp J, Gilbert A. Signal recovery from random measurements via orthogonal matching pursuit [J]. IEEE Transactions on Information Theory, 2007, 53(12): 4655-4666.
[7] Needell D, Vershynin R. Uniform uncertainty principle and signal recovery via regularized orthogonal matching pursuit [J]. Foundations of Computational Mathematics, 2009, 9(3): 317-334.
[8] Dai W, Milenkovic O. Subspace pursuit for compressive sensing signal reconstruction [C]. 2008 5th International Symposium on Turbo Codes and Related Topics, 2008: 402-407.
[9] Needell D, Tropp J A. CoSaMP: Iterative signal recovery from incomplete and inaccurate samples [J]. Applied and Computational Harmonic Analysis, 2009, 26(3): 301-321.
[10] Do T T, Gan L, Nguyen N, Tran T D. Sparsity adaptive matching pursuit algorithm for practical compressed sensing [C]. 2008 42nd Asilomar Conference on Signals, Systems and Computers, Pacific Grove, CA, 26-29 Oct. 2008: 581-587.
[11] Blumensath T, Davies M E. Iterative thresholding for sparse approximations [J]. Journal of Fourier Analysis and Applications, 2008, 14(5-6): 629-654.
[12] Blumensath T, Davies M E. Normalized iterative hard thresholding: Guaranteed stability and performance [J]. IEEE Journal of Selected Topics in Signal Processing, 2010, 4(2): 298-309.
[13] Candes E, Romberg J. Sparsity and incoherence in compressive sampling [J]. Inverse Problems, 2007, 23(3): 969.
[14] Sun G, Zuo J, Zhang Y, et al. Research on Sparse Basement of Compressed Sensing in Large-scale Wireless Sensor Networks [J]. Journal of Computational Information Systems, 2011, 7(12): 4185-4192.