
4.3 Structure-based classification

(unique node value in this context), which is not present among the other feature vectors (denoted by v, w, x and y in the colored boxes). Since the feature vectors are not comparable in this form, a zero is inserted at each position where a feature vector is missing a unique node value which is present in another feature vector. The resulting feature vectors are shown in figure 4.4(b). Now the unique node values are located at the same positions among all feature vectors, and the feature vectors can be compared by some distance metric.

When using this feature extractor along with the k-NN classifier, the distance metric used to calculate the distances between the feature vectors is the Euclidean distance metric already presented in equation (3.5).

For example, the distance between the feature vectors for images 1 and 2 presented in figure 4.4(b) along with the Euclidean distance metric would then result in the following distance

D(f_1, f_2) = \sqrt{v^2 + w^2} \qquad (4.56)

where f_1 and f_2 are the first two feature vectors from figure 4.4(b).
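To make the alignment and the subsequent distance computation concrete, the following is a minimal Python sketch of the process from figure 4.4; the dictionary-based representation of a feature vector (mapping a node position in the decomposition tree to its unique node value) and all names are illustrative assumptions, not the original implementation.

```python
import math

def align(vec_a, vec_b):
    """Insert zeros so both feature vectors cover the same set of
    tree positions (the process illustrated in figure 4.4)."""
    positions = sorted(set(vec_a) | set(vec_b))
    a = [vec_a.get(p, 0.0) for p in positions]
    b = [vec_b.get(p, 0.0) for p in positions]
    return a, b

def euclidean(a, b):
    """Euclidean distance metric, as in equation (3.5)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical feature vectors for images 1 and 2: they agree at
# position (0, 0), but each has one unique node value (v and w)
# the other lacks, so zeros are inserted at those positions.
f1 = {(0, 0): 1.7, (1, 2): 0.9}   # v = 0.9 at position (1, 2)
f2 = {(0, 0): 1.7, (2, 3): 0.4}   # w = 0.4 at position (2, 3)

a, b = align(f1, f2)
print(euclidean(a, b))  # sqrt(v^2 + w^2), cf. equation (4.56)
```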

Tree structure (FETS) This feature extractor is very similar to the feature extractor presented in the previous paragraph, since it is based on the unique node values too. But instead of limiting the number of subbands to use for feature vector generation, this feature extractor always uses the complete decomposition tree for the process of feature vector creation. A feature vector for a decomposition tree T_i can be written as

F_i = \{U_1, U_2, \ldots, U_n\} \qquad (4.57)

where U_j denotes the unique node value of the j-th leaf node of T_i and n is the total number of leaf nodes in T_i.
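A minimal sketch of the FETS feature vector construction, assuming a simple recursive quadtree node type (the class and field names are illustrative, not taken from the original code):

```python
class Node:
    """Quadtree node of a wavelet packet decomposition tree.
    A leaf carries a unique node value; an inner node has
    exactly four children."""
    def __init__(self, unique_value=None, children=None):
        self.unique_value = unique_value
        self.children = children or []

def fets_feature_vector(root):
    """Collect the unique node values of all leaf nodes of the
    decomposition tree, cf. equation (4.57)."""
    if not root.children:          # leaf node: contributes U_j
        return [root.unique_value]
    vec = []
    for child in root.children:    # depth-first over all subbands
        vec.extend(fets_feature_vector(child))
    return vec
```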

Since this feature extractor uses all subbands of a decomposition tree to construct the feature vector, the dominance tree extension is not needed. But due to the nature of the best-basis algorithm it is also possible, as was the case for the FEUNV feature extractor, that some feature vector contains a unique node value which is not present among the other feature vectors. Thus, again the process illustrated in figure 4.4 is applied to the feature vectors before they are passed into any classifier.

Again, if this feature extractor is used along with the k-NN classifier, the distance metric is the Euclidean distance metric from equation (3.5).

Tree distances (KNNTD) The last structural feature extractor we developed uses the quadtree structures resulting from the best-basis algorithm directly to compute the k nearest neighbours in the k-NN algorithm. Thus we search for the k nearest neighbours with respect to the decomposition trees, which can be regarded here as feature vectors, along with a quadtree distance metric. As quadtree distance metric we use the "distance based on unique node values" here (see section 4.3.2).
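A minimal sketch of this k-NN variant, assuming a tree_distance function implementing the "distance based on unique node values" from section 4.3.2 (its implementation, and all names here, are illustrative assumptions):

```python
from collections import Counter

def knn_classify(query_tree, training_set, k, tree_distance):
    """Classify a decomposition tree by majority vote among its
    k nearest neighbours under a quadtree distance metric.

    training_set is a list of (tree, class_label) pairs;
    tree_distance(a, b) is the quadtree distance metric, e.g. the
    distance based on unique node values (section 4.3.2)."""
    # Rank all training trees by their distance to the query tree.
    ranked = sorted(training_set,
                    key=lambda item: tree_distance(query_tree, item[0]))
    # Majority vote over the k closest decomposition trees.
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]
```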

