outliers which comprises the foreground. Depending on the layer classification, the pixel is then classified as foreground or background.

Optic flow [25] has been used for image segmentation as well. In [26], Zucchelli et al. cluster motion fields in order to recognize image planes. While these planes are not applied to foreground segmentation, that extension would be an intuitive next step given the planes recovered. The use of motion in image segmentation has been gaining traction recently because new hardware implementations are able to calculate flow in real time. This represents an exciting avenue for future research using this additional feature.

In [24], motion is used directly for background subtraction. Kernel Density Estimation [27] is used to model the distribution of the background with five-dimensional feature vectors. Classification is then performed by thresholding the probability that an instance belongs to the background distribution (a minimal sketch of this rule is given at the end of this section). The feature vectors consist of three dimensions from color space intensities and two from optic flow measurements (the flow estimates and their uncertainties). This algorithm performed extremely well when compared to Mixture of Gaussians models.

2.3 Illumination Considerations

Many approaches have been used for illumination-invariant background classification. One reason for poor FG/BG segmentation under varying illumination is that most tracking systems rely on color from RGB color space, which is highly variant to illumination changes. Because grayscale is a linear transformation of RGB, the same problem exists. A common solution is a non-linear mapping of RGB to another color space that is less sensitive to illumination changes.

Many color conversions that claim to be illumination invariant have been proposed. Theoretically, the hue component of HSI color space and the luma component (Y) of YCbCr color space are illumination invariant; however, in practice this is not typically the case (see Section 3.3). In [24], Mittal and Paragios use the color mapping RGB → rgI (where I = (R + G + B)/3, r = 3R/I, and g = 3G/I) due to claimed illumination invariance under certain conditions (a short example of this mapping also appears below). Experimentation performed in this work using this color space failed to observe such illumination-invariant conditions, though this is not to say they do not exist. In [28], Gevers and Smeulders
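As a rough illustration of the KDE-based classification described above, the following sketch keeps a set of background samples per pixel and labels a new observation as background when its estimated density under those samples exceeds a threshold. The 5-D feature layout (three color channels plus two optic-flow measurements), the kernel bandwidths, and the threshold are illustrative assumptions, not the settings used in [24].

```python
import numpy as np

def kde_probability(sample, background_samples, bandwidths):
    """Gaussian-kernel density estimate of `sample` (shape (5,)) given
    `background_samples` (shape (N, 5)) and per-dimension `bandwidths`."""
    diffs = (background_samples - sample) / bandwidths        # (N, 5)
    kernels = np.exp(-0.5 * np.sum(diffs ** 2, axis=1))       # (N,)
    norm = np.prod(bandwidths) * (2.0 * np.pi) ** 2.5         # 5-D normalization
    return np.mean(kernels) / norm

def classify_pixel(sample, background_samples, bandwidths, threshold=1e-5):
    """Return True if the pixel is classified as background."""
    return kde_probability(sample, background_samples, bandwidths) >= threshold

# Example: 50 stored background samples for one pixel
# (3 color intensities + 2 flow measurements); values are synthetic.
rng = np.random.default_rng(0)
bg = rng.normal([120, 110, 100, 0.0, 0.0], [5, 5, 5, 0.2, 0.2], size=(50, 5))
bandwidths = np.array([8.0, 8.0, 8.0, 0.3, 0.3])

print(classify_pixel(np.array([122, 108, 101, 0.1, -0.1]), bg, bandwidths))  # likely True (background)
print(classify_pixel(np.array([30, 200, 40, 3.0, 2.5]), bg, bandwidths))     # likely False (foreground)
```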

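The rgI mapping quoted from [24] can be written directly from the formulas in the text (I = (R + G + B)/3, r = 3R/I, g = 3G/I). The sketch below applies it per pixel; the small epsilon guarding division by zero is an added assumption, not part of the cited work. Scaling the image by a constant factor scales I but leaves r and g unchanged, which is the claimed (conditional) illumination invariance.

```python
import numpy as np

def rgb_to_rgI(image, eps=1e-6):
    """Convert an (H, W, 3) float RGB image to the rgI representation."""
    R, G, B = image[..., 0], image[..., 1], image[..., 2]
    I = (R + G + B) / 3.0          # intensity
    r = 3.0 * R / (I + eps)        # chromaticity-like components
    g = 3.0 * G / (I + eps)
    return np.stack([r, g, I], axis=-1)

# Example: doubling the brightness changes I but not r and g.
img = np.array([[[0.2, 0.4, 0.6]]])
print(rgb_to_rgI(img))        # [[[1.5  3.0  0.4]]] (approximately)
print(rgb_to_rgI(2.0 * img))  # [[[1.5  3.0  0.8]]] (approximately)
```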