
CHAPTER 6. ITERATIVE EIGENVALUE SOLVERS

much sparser commuting matrices $A_{x_1}, \ldots, A_{x_n}$ in the inner loop, causing a speed-up in computation time and a decrease in the required number of floating point operations. This section also gives the pseudocode of the JDCOMM method. Numerical experiments using the approaches described in this chapter are presented in Chapter 7, where the performance of the various approaches is compared to that of conventional iterative methods and of other methods for multivariate polynomial optimization.

6.1 Iterative eigenvalue solvers

In the previous chapters it was discussed how the global minimum of a Minkowski dominated polynomial $p_\lambda$ can be found by solving an eigenvalue problem. Unless the dimensions are small, this eigenvalue problem is preferably handled with iterative eigenvalue solvers, because these allow for computations in a matrix-free fashion using an nD-system as described in Chapter 5. The matrix-free approach implements the action $Ax$ of the matrix $A$ on a given vector $x$ without the explicit construction of the matrix $A$. This action is carried out repeatedly to approximate the eigenvectors and eigenvalues of the matrix $A$.
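As a minimal sketch of this matrix-free idea, one can wrap the action $x \mapsto Ax$ in SciPy's `LinearOperator` interface and hand it to an iterative solver. The tridiagonal action below is a hypothetical stand-in for the nD-system recursion of Chapter 5, not the thesis's actual operator:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigs

n = 100  # hypothetical problem size

def apply_A(x):
    # Example action of a tridiagonal operator (2 on the diagonal,
    # -1 on the off-diagonals), computed without forming the matrix.
    y = 2.0 * x
    y[:-1] -= x[1:]
    y[1:] -= x[:-1]
    return y

# The solver only ever sees the action x -> A x, never A itself.
A_op = LinearOperator((n, n), matvec=apply_A)

# ARPACK's Arnoldi iteration approximates a few eigenpairs
# of largest magnitude from repeated applications of apply_A.
vals, vecs = eigs(A_op, k=3, which='LM')
```

Any solver that accepts a `LinearOperator` (Arnoldi, Lanczos, Jacobi--Davidson variants) can be driven this way, which is what makes the nD-system approach practical for large dimensions.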

A well-known iterative procedure for finding the largest eigenvalue/eigenvector pair is the power method [72]. This method operates as follows: let $v_1, v_2, \ldots, v_N$ be the $N$ eigenvectors of an $N \times N$ matrix $A$, assuming that $N$ independent eigenvectors $v_i$ exist. Let $\lambda_1, \lambda_2, \ldots, \lambda_N$ be the corresponding eigenvalues, such that $|\lambda_1| > |\lambda_2| \geq \ldots \geq |\lambda_N|$. Then each $x \in \mathbb{R}^N$ can be written as a linear combination of the eigenvectors $v_i$: $x = \alpha_1 v_1 + \alpha_2 v_2 + \ldots + \alpha_N v_N$. The action of the matrix $A$ on the vector $x$ yields: $Ax = \alpha_1 \lambda_1 v_1 + \alpha_2 \lambda_2 v_2 + \ldots + \alpha_N \lambda_N v_N$. More generally it holds that $A^k x = \alpha_1 \lambda_1^k v_1 + \alpha_2 \lambda_2^k v_2 + \ldots + \alpha_N \lambda_N^k v_N \approx \alpha_1 \lambda_1^k v_1$ when $k$ is large enough, because the eigenvalue $\lambda_1$ is dominant. Therefore, this method (extended with a renormalization procedure) allows for the iterative approximate calculation of the eigenvalue with the largest magnitude and its accompanying eigenvector.
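The iteration described above, including the renormalization step, can be sketched in a few lines. The operator, test matrix, and iteration count below are illustrative choices, not taken from the thesis:

```python
import numpy as np

def power_method(matvec, n, iters=500, seed=0):
    """Approximate the dominant eigenpair of an n x n operator,
    given only its action x -> A x (a sketch of the power method)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(iters):
        y = matvec(x)
        x = y / np.linalg.norm(y)   # renormalization keeps iterates bounded
    lam = x @ matvec(x)             # Rayleigh quotient estimate of lambda_1
    return lam, x

# Illustration on a small diagonal matrix with dominant eigenvalue 3:
A = np.diag([3.0, 1.0, 0.5])
lam, v = power_method(lambda x: A @ x, 3)
```

Note that only the matrix-vector product `matvec` is required, so the same routine runs unchanged in the matrix-free setting of the previous paragraph. Convergence is governed by the ratio $|\lambda_2|/|\lambda_1|$.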

The inverse power method is an iterative procedure which allows one to compute the eigenvalue with the smallest magnitude [72]. This method operates with the matrix $A^{-1}$ rather than $A$. Note that the matrices $A$ and $A^{-1}$ have the same eigenvectors, since $A^{-1}x = (1/\lambda)x$ follows from $Ax = \lambda x$. Since the eigenvalues of the matrix $A^{-1}$ are $1/\lambda_1, \ldots, 1/\lambda_N$, the inverse power method converges to the eigenvector $v_N$ associated with the eigenvalue $1/\lambda_N$: iteratively computing the action of $A^{-1}$ on a vector $x = \alpha_1 v_1 + \alpha_2 v_2 + \ldots + \alpha_N v_N$, where the $v_i$ are the eigenvectors of the matrix, yields $(A^{-1})^k x = \alpha_1 (1/\lambda_1)^k v_1 + \alpha_2 (1/\lambda_2)^k v_2 + \ldots + \alpha_N (1/\lambda_N)^k v_N \approx \alpha_N (1/\lambda_N)^k v_N$ when $k$ is large enough.
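In practice the action of $A^{-1}$ is obtained by solving the linear system $Ay = x$ rather than by forming the inverse explicitly. A minimal sketch, assuming a dense matrix small enough for an LU factorization (the matrix and iteration count are hypothetical):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def inverse_power_method(A, iters=200, seed=0):
    """Approximate the eigenpair of smallest magnitude by iterating
    with A^{-1}, realized as repeated linear solves (a sketch)."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    lu = lu_factor(A)               # factor once, reuse every iteration
    for _ in range(iters):
        y = lu_solve(lu, x)         # y = A^{-1} x without explicit inverse
        x = y / np.linalg.norm(y)
    lam = x @ (A @ x)               # Rayleigh quotient: smallest eigenvalue of A
    return lam, x

# Illustration: the smallest-magnitude eigenvalue of this matrix is 0.25.
A = np.diag([3.0, 1.0, 0.25])
lam, v = inverse_power_method(A)
```

The factorization is computed once and reused, so each iteration costs only a forward and a backward substitution; convergence is governed by the ratio $|\lambda_N|/|\lambda_{N-1}|$.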

Other, more advanced, iterative eigenvalue solvers are the Arnoldi, Lanczos, and Davidson methods [30]. Davidson's method is efficient for computing a few of the smallest eigenvalues of a real symmetric matrix. The methods of Lanczos [73] and Arnoldi [4], [77], in their original form, tend to exhibit slow convergence and experience difficulties in finding the so-called interior eigenvalues. Interior eigenvalues are eigenvalues
