Principles of Modern Radar - Volume 2
CHAPTER 5   Radar Applications of Sparse Reconstruction

$$
\begin{aligned}
&= \operatorname*{argmin}_{x}\; g(x) + \frac{P}{2}\left\| x - \left( b - \frac{1}{P}\,\nabla f(b) \right) \right\|_2^2 \\
&= \operatorname*{argmin}_{x}\; \lambda \|x\|_1 + \frac{P}{2}\left\| x - \left( b - \frac{2}{P}\, A^H (A b - y) \right) \right\|_2^2 \qquad (5.31) \\
&= \eta_s\!\left( b - \frac{2}{P}\, A^H (A b - y),\; \frac{\lambda}{P} \right) \qquad (5.32)
\end{aligned}
$$

Notice that the term in parentheses in (5.31) is a constant for fixed b. Thus, the minimization in (5.31) can be carried out component-wise, yielding a simple analytical solution that corresponds to the application of the soft threshold. Combining these ideas, we obtain ISTA:

$$
\hat{x}^{k+1} = \eta_s\!\left( \hat{x}^k - \frac{2}{P}\, A^H (A \hat{x}^k - y),\; \frac{\lambda}{P} \right)
$$

Unfortunately, ISTA has been shown to enjoy only a sublinear rate of convergence, that is, proportional to 1/k [79, Theorem 3.1]. Beck and Teboulle [79] propose the fast ISTA (FISTA), which uses a very simple modification to obtain a quadratic (1/k^2) convergence rate. FISTA requires nearly identical computational cost, particularly for a large-scale problem, and is given by

$$
\begin{aligned}
z^1 &= \hat{x}^0 = 0 \\
t^1 &= 1 \\
\hat{x}^k &= \eta_s\!\left( z^k - \frac{2}{P}\, A^H (A z^k - y),\; \frac{\lambda}{P} \right) \\
t^{k+1} &= \frac{1 + \sqrt{1 + 4\,(t^k)^2}}{2} \\
z^{k+1} &= \hat{x}^k + \frac{t^k - 1}{t^{k+1}}\left( \hat{x}^k - \hat{x}^{k-1} \right)
\end{aligned}
$$

Intuitively, this algorithm uses knowledge of the previous two iterates to take faster steps toward the global minimum. No additional applications of A and A^H are required compared with ISTA, and thus the computational cost of this modification is negligible for the large-scale problems of interest.

ISTA and FISTA^29 converge to true solutions of QP_λ at sublinear and quadratic convergence rates, respectively. Thus, these algorithms inherit the RIP-based performance guarantees already proven for minimization of this cost function, for example (5.15). Indeed, these algorithms are close cousins of the Kragh algorithm described in the previous section. The key difference is that the derivation is restricted to the p = 1 norm case to take advantage of a simple analytical result for the minimization step of the algorithm. As a final note on these algorithms, in [61] the authors leverage the same previous work that inspired FISTA to derive the NESTA algorithm. This approach provides extremely fast computation, particularly when the forward operator A enjoys certain properties. In addition, the provided algorithm can solve more general problems, including minimization

^29 We should mention that the FISTA algorithm given in [79] is more general than the result provided here, which has been specialized for our problem of interest.
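To make the recursions above concrete, the following is a minimal NumPy sketch of FISTA as specialized here (dropping the momentum update, i.e., setting z^{k+1} = x̂^k, recovers ISTA). The function names soft_threshold and fista, the choice P = 2·σ_max(A)^2 (the Lipschitz constant of the gradient of the data-fidelity term), the fixed iteration count, and the example problem sizes are illustrative assumptions, not taken from the text.

```python
import numpy as np

def soft_threshold(v, tau):
    """Soft threshold eta_s(v, tau): shrink each entry's magnitude by tau (handles complex v)."""
    mag = np.abs(v)
    return np.maximum(mag - tau, 0.0) / np.maximum(mag, 1e-16) * v

def fista(A, y, lam, n_iter=200):
    """FISTA sketch for min_x ||A x - y||_2^2 + lam * ||x||_1.

    P is taken as 2 * sigma_max(A)^2 so that the quadratic majorization
    behind (5.31) holds; larger values also work but slow convergence.
    """
    P = 2.0 * np.linalg.norm(A, 2) ** 2            # 2 * sigma_max(A)^2
    x_prev = np.zeros(A.shape[1], dtype=complex)   # x^0 = 0
    z = x_prev.copy()                              # z^1 = x^0
    t = 1.0                                        # t^1 = 1
    for _ in range(n_iter):
        # Gradient step on the smooth term followed by the soft threshold, as in (5.32)
        x = soft_threshold(z - (2.0 / P) * (A.conj().T @ (A @ z - y)), lam / P)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t**2)) / 2.0
        z = x + ((t - 1.0) / t_next) * (x - x_prev)  # momentum from the two latest iterates
        x_prev, t = x, t_next
    return x_prev

# Illustrative use: recover a 5-sparse scene of length 256 from 64 noisy measurements.
rng = np.random.default_rng(0)
A = (rng.standard_normal((64, 256)) + 1j * rng.standard_normal((64, 256))) / np.sqrt(64)
x_true = np.zeros(256, dtype=complex)
x_true[rng.choice(256, 5, replace=False)] = 1.0
y = A @ x_true + 0.01 * (rng.standard_normal(64) + 1j * rng.standard_normal(64))
x_hat = fista(A, y, lam=0.05)
```

Note that each iteration applies A and A^H exactly once, matching the per-iteration cost noted above for both ISTA and FISTA.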
