Low_resolution_Thesis_CDD_221009_public - Visual Optics and ...
CHAPTER 2
aberration estimation from the corresponding Excel file stored during the measurement, and saves the results in a new Excel file in the corresponding project folder.
Fig. 2.27. Detail of the centroid processing. (a) Raw image corresponding to a recorded aerial image. (b) Processed image: background subtraction, filtering and thresholding. Bottom row: zoomed images. The centroids are computed as an intensity mass center (green cross). (From Sergio Barbero's Thesis (Barbero, 2004)).
2.3.2.6. Pupil images (passive eye- and lens-tracking)
The system saves pupil images during the measurement, synchronized with the retinal images, which allows a new functionality in LRT measurements: passive eye-tracking. The edges of the pupil can be detected to estimate the exact pupil position at each frame. The corrected entrance position of each ray can then be used in the wave aberration retrieval algorithms to obtain a more accurate wave aberration measurement.
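The entrance-position correction described above can be illustrated with a short sketch. This is a hypothetical example, not the thesis implementation: it assumes the pupil center has already been estimated for the frame, and simply refers the nominal ray entry coordinates to the displaced pupil; the sign convention is an assumption.

```python
import numpy as np

def correct_entrance_positions(nominal_xy, pupil_center, reference_center):
    """Refer nominal ray entry positions to the pupil detected in a frame.

    Hypothetical sketch: if the pupil center detected in a given frame
    has drifted from its reference position, each nominal ray entry
    coordinate is shifted by the measured offset, so the retrieval
    uses positions relative to the actual pupil.
    """
    offset = (np.asarray(pupil_center, float)
              - np.asarray(reference_center, float))
    # Shift all entry coordinates by the pupil displacement
    return np.asarray(nominal_xy, float) - offset
```

For example, a 0.2 mm horizontal pupil drift shifts every ray entry coordinate by -0.2 mm in x under this convention.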
Fig. 2.28 shows two examples of corrected entrance positions, for a patient with large eye movements (a) and a patient with normal eye movements (b). The hardware needed was set up, and the acquisition software developed, during this thesis. A preliminary version was first tested in LRT1, and a fully functional, synchronized version was then incorporated in LRT2. Pupil images served to reject retinal images subject to eye-motion artefacts or occlusion. Further developments of the pupil image analysis in the LRT2 system by Lourdes Llorente have allowed automatic detection of the pupil edge, and therefore compensation for potential eye movements in the measurements (Llorente, 2009, Llorente et al., 2007).
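Once pupil-edge points have been detected, the pupil position can be recovered by fitting a circle to them. The following sketch uses an algebraic (Kåsa) least-squares circle fit; this is an illustrative method choice, not necessarily the one used in the LRT2 analysis, and it assumes the edge points have already been extracted from the pupil image.

```python
import numpy as np

def fit_pupil_circle(edge_x, edge_y):
    """Least-squares (Kasa) circle fit to detected pupil-edge points.

    Hypothetical sketch: solve x^2 + y^2 + D*x + E*y + F = 0 in the
    least-squares sense, then recover the center and radius from the
    algebraic coefficients.
    """
    x = np.asarray(edge_x, float)
    y = np.asarray(edge_y, float)
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    cx, cy = -D / 2.0, -E / 2.0          # circle center
    r = np.sqrt(cx**2 + cy**2 - F)       # circle radius
    return cx, cy, r
```

An algebraic fit like this is fast enough to run per frame; with noisy edge points a robust variant (e.g. discarding outliers before refitting) would be preferable.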
The stored pupillary images also helped in the analysis of LRT measurements with contact lenses, as they allowed the lens position to be estimated, relative to the pupil position, at each frame, as will be described in Chapter 9.
2.3.2.7. System Calibration
This section describes the calibration procedure for the different elements of the system and the validation of the measurements provided. Additional descriptions can be found in Lourdes Llorente's thesis (Llorente, 2009).