
Acceleration of Iterative Image Restoration Algorithms

the extent to which the acceleration method can be applied in practice. The use of technique II should allow iterative image restoration algorithms to be used in applications where they were previously considered too slow. As mentioned previously, there is considerable scope for achieving higher levels of acceleration when more information is used in the acceleration process. The new acceleration method could also be used as a generic acceleration method when iterative algorithms are designed. This would allow an algorithm to be designed to produce the best result rather than having to compromise for speed.

4 Conclusion

Traditional acceleration methods based on a gradient-search approach have been presented, but they require careful use to prevent instability during restoration. A new method, called automatic acceleration, has been proposed and shown to achieve significantly greater acceleration than previous methods with little computational overhead. An example R-L restoration achieves an average speedup of 40 times after 250 iterations, and an ME restoration a speedup of 20 times after only 50 iterations. Acceleration is achieved without the need for a cost function to be determined or minimized. The new method is stable, and an estimated acceleration factor has been derived and confirmed by experiment. The acceleration technique has been successfully applied to Richardson-Lucy, maximum-entropy, and Gerchberg-Saxton restoration algorithms and can be integrated with other iterative techniques. The technique shows considerable promise in accelerating a wide range of iterative image restoration algorithms, which should allow them to be used in applications where previously they would have been considered impractical.

5 Acknowledgments

The authors thank Russell Smith of The University of Auckland and Richard Lane of Canterbury University for their contributions to this paper. The reviewers are also acknowledged for their useful comments and suggestions. D. S. C. Biggs also wishes to acknowledge the support of The University of Auckland and the Royal Society of New Zealand through the R.H.T. Bates Scholarship.
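To make the preceding summary concrete, the sketch below wraps the automatic acceleration (technique II, as derived in Appendix A) around a Richardson-Lucy step. It is a minimal NumPy illustration under stated assumptions rather than the authors' implementation: the function names, the flat starting estimate, the non-negativity projection of the predicted point, the clamping of alpha to [0, 1], and the small numerical guards are additions for robustness, not details taken from the paper.

```python
import numpy as np

def rl_step(x, b, otf):
    """One Richardson-Lucy iteration (the operator Psi of Appendix A),
    using circular FFT convolution; otf is the FFT of the point-spread function."""
    reblur = np.real(np.fft.ifft2(otf * np.fft.fft2(x)))
    ratio = b / np.maximum(reblur, 1e-12)          # guard against division by zero
    return x * np.real(np.fft.ifft2(np.conj(otf) * np.fft.fft2(ratio)))

def accelerated_rl(b, otf, n_iter=250):
    """Automatic acceleration (technique II) wrapped around R-L:
    y_k = x_k + alpha_k (x_k - x_{k-1}),  x_{k+1} = Psi(y_k),
    alpha_{k+1} = g_k . g_{k-1} / (g_{k-1} . g_{k-1})."""
    x_prev = x = np.full_like(b, b.mean())          # flat starting estimate (assumption)
    g_prev, alpha = None, 0.0
    for _ in range(n_iter):
        y = np.maximum(x + alpha * (x - x_prev), 0.0)   # predicted point y_k, kept non-negative
        x_new = rl_step(y, b, otf)                      # x_{k+1} = Psi(y_k)
        g = x_new - y                                   # g_k = Psi(y_k) - y_k
        if g_prev is not None:
            alpha = float(np.vdot(g, g_prev)) / (float(np.vdot(g_prev, g_prev)) + 1e-30)
            alpha = min(max(alpha, 0.0), 1.0)           # safeguard clamp (assumption)
        x_prev, x, g_prev = x, x_new, g
    return x
```

In this form the only per-iteration overhead of the acceleration is an inner product and a vector update, consistent with the small computational cost noted in the conclusion.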
Appendix A: Acceleration Parameter for Technique II

We consider the iterative algorithm $\Psi(\cdot)$, which generates a new estimate $x_{k+1}$ from the current vector $x_k$,

$$x_{k+1} = \Psi(x_k) = x_k + g_k,$$

where $x_k$ slowly converges towards a solution at $x_s$. The change $g_k$ introduced at iteration $k$ can be considered a step in the gradient-ascent direction toward the solution. The function to be maximized can be approximated locally by a multidimensional quadratic,

$$m(x) = -(x - x_s)^{\top} A (x - x_s), \qquad \nabla m(x) = -2A(x - x_s),$$

where $A$ is a square, symmetric, and positive-definite matrix. Taking a step $\alpha \nabla m(x)$ in the gradient direction at each iteration produces a path through the multidimensional space towards the solution. This path can be described by the solution of the difference equation in terms of the eigenvalues ($\Lambda_A$) and eigenvectors ($S$) of $A$:

$$g_k = \alpha \nabla m(x_k) = -2\alpha A(x_k - x_s),$$
$$A = S \Lambda_A S^{-1}, \qquad B = I - 2\alpha A, \qquad \Lambda_B = I - 2\alpha \Lambda_A,$$
$$x_k = S \Lambda_B^{k} S^{-1}(x_0 - x_s) + x_s, \qquad \alpha < \frac{1}{2\max(\Lambda_A)},$$

where $x_s$ is the vector to which $x_k$ converges, $\Lambda_A$ and $\Lambda_B$ are diagonal matrices, and $\alpha$ is a scalar chosen to prevent $\Lambda_B$ from becoming negative. We have no knowledge of the form of the quadratic or of its eigenvalues and eigenvectors. The aim of acceleration is to move along this path (or close to it) faster than the iterative algorithm does; if this is possible, the accelerated method should reach the same solution, given that the same path is taken. Using the current and previous estimates and the associated gradient information, the aim is to predict a new estimate further along the path:

$$x_k = S \Lambda_B^{k} S^{-1}(x_0 - x_s) + x_s, \qquad g_{k-1} = S \Lambda_B^{k}(I - \Lambda_B^{-1}) S^{-1}(x_0 - x_s),$$
$$x_{k-d} = S \Lambda_B^{k-d} S^{-1}(x_0 - x_s) + x_s, \qquad g_{k-d-1} = S \Lambda_B^{k-d}(I - \Lambda_B^{-1}) S^{-1}(x_0 - x_s).$$

The difference between the current estimate $x_k$ and the previous estimate $x_{k-d}$ is

$$h_{k,-d} = x_k - x_{k-d} = S \Lambda_B^{k}(I - \Lambda_B^{-d}) S^{-1}(x_0 - x_s).$$

Given estimates at $x_k$ and $x_{k-d}$, the aim is to predict $x_{k+d}$:

$$x_{k+d} = S \Lambda_B^{k+d} S^{-1}(x_0 - x_s) + x_s, \qquad g_{k+d-1} = S \Lambda_B^{k+d}(I - \Lambda_B^{-1}) S^{-1}(x_0 - x_s),$$
$$h_{k,d} = x_{k+d} - x_k = S \Lambda_B^{k+d}(I - \Lambda_B^{-d}) S^{-1}(x_0 - x_s).$$

Using a first-order extrapolation, we predict $h_{k,d}$ by a least-squares projection onto $h_{k,-d}$, scaled by the scalar $\alpha_k$:

$$\hat{h}_{k,d} = \alpha_k h_{k,-d}, \qquad 0 = (h_{k,d} - \alpha_k h_{k,-d})^{\top} h_{k,-d}, \qquad \alpha_k = \frac{h_{k,d}^{\top} h_{k,-d}}{h_{k,-d}^{\top} h_{k,-d}}.$$

We do not have knowledge of $h_{k,d}$, but we do have the gradient information: $h_{k,d}$ is related to $g_{k-1} = S \Lambda_B^{k}(I - \Lambda_B^{-1}) S^{-1}(x_0 - x_s)$ by the same factor that relates $h_{k,-d}$ to $g_{k-d-1}$, so the acceleration parameter can be approximated from the gradients alone:

$$\alpha_k \approx \frac{g_{k-1}^{\top} g_{k-d-1}}{g_{k-d-1}^{\top} g_{k-d-1}}.$$

Taking $d = 1$, we can predict a new point $y_k$ and apply the iterative algorithm to generate the next estimate $x_{k+1}$ and gradient $g_k$:

$$y_k = x_k + \alpha_k (x_k - x_{k-1}), \qquad x_{k+1} = \Psi(y_k) = y_k + g_k,$$
$$\alpha_{k+1} = \frac{g_k^{\top} g_{k-1}}{g_{k-1}^{\top} g_{k-1}}, \qquad y_{k+1} = x_{k+1} + \alpha_{k+1}(x_{k+1} - x_k).$$

This formulation of the acceleration parameter requires no knowledge of the quadratic $A$ or of its eigenvalues and eigenvectors, and it does not require that they be constant over the entire error surface.

Appendix B: Estimating the Acceleration Factor for Technique II

Being able to estimate the acceleration at each step is useful for comparing the technique with unaccelerated restoration and for determining how effective the acceleration process is. Continuing from Appendix A, the acceleration achieved at iteration $k$ is related to how effectively $h_{k,d}$ is estimated. If $\eta_k$ is the efficiency parameter, then the step actually achieved is $h_{k,\eta_k d_k}$,

$$\hat{h}_{k,d_k} = \alpha_k h_{k,-d_k}, \qquad h_{k,\eta_k d_k} = \alpha_k h_{k,-d_k},$$

and therefore the effective number of unaccelerated iterations achieved by the acceleration is $\eta_k d_k$. The value of $d_k$ can be estimated from all previous values of $\eta_k$:

$$d_k = 1 + \sum_{j=1}^{k-1} \; \prod_{m=k-j}^{k-1} \eta_m.$$

The acceleration factor $\beta_k$ is given by $\eta_k d_k$ plus the single step of the iterative algorithm:

$$\beta_k = 1 + \sum_{j=1}^{k} \; \prod_{m=k-j+1}^{k} \eta_m.$$

The value of $\eta_k$ is difficult to determine exactly, though it is proposed that it can be estimated by

$$\hat{\eta}_k = \alpha_k \, \frac{\lVert g_{k-2} \rVert}{\lVert g_{k-1} \rVert} \, \frac{g_{k-1}^{\top} g_{k-2}}{\lVert g_{k-1} \rVert \, \lVert g_{k-2} \rVert}.$$

The estimates of acceleration achieved in this paper use $M = 3$ [Eq. (13)], which correlates well with experimental results.
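As a rough numerical companion to Appendix B, the sketch below evaluates the efficiency estimate $\hat{\eta}_k$ and the acceleration factor $\beta_k$ from stored gradients. It is an assumption-laden illustration, not the paper's code: the helper names are mine, g1 and g2 stand for $g_{k-1}$ and $g_{k-2}$, and since Eq. (13) and its parameter $M = 3$ are not reproduced in this excerpt, the untruncated sum above is evaluated through an equivalent recursion.

```python
import numpy as np

def efficiency_estimate(g1, g2, alpha):
    """Appendix B efficiency estimate:
    eta_k ~= alpha_k (||g_{k-2}|| / ||g_{k-1}||) (g_{k-1}.g_{k-2}) / (||g_{k-1}|| ||g_{k-2}||),
    with g1 = g_{k-1}, g2 = g_{k-2}, and alpha = alpha_k from Appendix A."""
    n1, n2 = np.linalg.norm(g1), np.linalg.norm(g2)
    return alpha * (n2 / n1) * float(np.vdot(g1, g2)) / (n1 * n2)

def acceleration_factor(etas):
    """beta_k = 1 + sum_{j=1}^{k} prod_{m=k-j+1}^{k} eta_m, computed through the
    equivalent recursion d_1 = 1, d_k = 1 + eta_{k-1} d_{k-1}, beta_k = 1 + eta_k d_k."""
    d = 1.0
    for eta in etas[:-1]:            # build d_k from eta_1 ... eta_{k-1}
        d = 1.0 + eta * d
    return 1.0 + etas[-1] * d        # beta_k = 1 + eta_k d_k
```

Under this untruncated form a constant efficiency $\eta$ saturates the factor at $1/(1-\eta)$, so $\eta$ must remain close to 1 for speedups as large as those reported in the conclusion.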
