
APPENDIX D. GMRES AND ARNOLDI ITERATIVE METHODS

be the $j$-th Krylov iterate. So $a^j$ solves the least-squares problem
\[
\min_{a \in a^0 + K_j} \|Da - b\|_2.
\]
Let $\{p_i\}_{i=1}^{j}$ form an orthonormal basis of the Krylov space $K_j$. Then every vector $v \in K_j$ can be written as
\[
v = \sum_{i=1}^{j} \gamma_i p_i.
\]
If we take the vectors $\{p_i\}_{i=1}^{j}$ and form a matrix $P_j \in \mathbb{R}^{M \times j}$ whose columns are the vectors $\{p_i\}_{i=1}^{j}$, then every vector $v \in K_j$ can be written as
\[
v = P_j \gamma
\]
for some $\gamma \in \mathbb{R}^{j}$. Therefore, the $j$-th Krylov iterate satisfies
\[
\min_{a \in a^0 + K_j} \|Da - b\|_2
= \min_{\gamma \in \mathbb{R}^{j}} \|D(a^0 + P_j \gamma) - b\|_2
= \min_{\gamma \in \mathbb{R}^{j}} \|D P_j \gamma - r^0\|_2,
\]
where $r^0 = b - Da^0$ is the initial residual.
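The equivalence above can be checked numerically: minimizing over the affine space $a^0 + K_j$ gives the same residual as the reduced $j$-dimensional problem in $\gamma$. A minimal sketch with NumPy, assuming a generic dense matrix $D$ (the basis is built here with a simple QR factorization of the Krylov vectors, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
M, j = 8, 3
D = rng.standard_normal((M, M))
b = rng.standard_normal(M)
a0 = rng.standard_normal(M)
r0 = b - D @ a0                      # initial residual r^0 = b - D a^0

# Krylov vectors {r0, D r0, ..., D^{j-1} r0}; QR gives an orthonormal basis P_j
K = np.column_stack([np.linalg.matrix_power(D, i) @ r0 for i in range(j)])
Pj, _ = np.linalg.qr(K)

# Reduced j-dimensional least-squares problem for the coefficients gamma
gamma, *_ = np.linalg.lstsq(D @ Pj, r0, rcond=None)
aj = a0 + Pj @ gamma                 # the j-th Krylov iterate

# The full residual at a^j equals the reduced residual
print(np.linalg.norm(D @ aj - b))
print(np.linalg.norm(D @ Pj @ gamma - r0))
```

The two printed norms agree because $D a^j - b = D P_j \gamma - r^0$ identically, so solving the small problem in $\gamma$ solves the constrained problem in $a$.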

This means that to compute the $j$-th Krylov iterate, we can solve a $j$-dimensional linear least-squares problem for the coefficients $\gamma$. This $j$-dimensional linear least-squares problem can be solved with a QR factorization or an SVD [28], but to do so we need to compute the matrix $P_j$. Now, the columns of $P_j$ can be computed by applying the Gram-Schmidt procedure to the set $\{r^0, Dr^0, D^2 r^0, \ldots, D^{j-1} r^0\}$. This computational algorithm for determining $P_j$ is known as the Arnoldi procedure [19]. The Arnoldi
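The basis construction just described can be sketched as follows. This is an illustrative NumPy implementation, not the thesis's code; it uses the standard Arnoldi-style variant in which each new Krylov direction is obtained by applying $D$ to the previous basis vector (mathematically equivalent to orthogonalizing the explicit powers $D^i r^0$, but better conditioned):

```python
import numpy as np

def krylov_basis(D, r0, j):
    """Orthonormal basis P_j of K_j = span{r0, D r0, ..., D^{j-1} r0}
    via modified Gram-Schmidt, applying D to the previous basis vector
    rather than forming the powers D^i r0 explicitly."""
    M = len(r0)
    P = np.zeros((M, j))
    P[:, 0] = r0 / np.linalg.norm(r0)
    for k in range(1, j):
        w = D @ P[:, k - 1]              # next Krylov direction
        for i in range(k):               # orthogonalize against p_1, ..., p_k
            w -= (P[:, i] @ w) * P[:, i]
        P[:, k] = w / np.linalg.norm(w)  # assumes no breakdown (w != 0)
    return P
```

By construction $P_j^T P_j = I$, and the columns span the same space as $\{r^0, Dr^0, \ldots, D^{j-1} r^0\}$ provided the procedure does not break down.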
