Weakly supervised classification of objects in images using soft ...

6 Riwal Lefort, Ronan Fablet, Jean-Marc Boucher

fall in the test set T_t^m. Setting β obeys a trade-off: for a good assessment of the random forest C_m, the number of samples in T_r^m must be high enough; but if β is too high, only very few samples are updated at each iteration, leading to a very slow convergence of the algorithm. In practice, β is typically set to 0.75. The algorithm is given in Table 2. In the following, this procedure is denoted IP2 (Iterative Procedure 2).

Given a training dataset T_1 = {x_n, π_n^1} and M iterations:
1. For m from 1 to M:
   - Randomly split T_m into two groups T_r^m = {x_n, π_n^m} and T_t^m = {x_n, π_n^m} according to a split proportion β.
   - Learn a classifier C_m from subset T_r^m.
   - Apply classifier C_m to subset T_t^m.
   - Update T_t^{m+1} = {x_n, π_n^{m+1}} with π_n^{m+1} ∝ π_n^1 p(x_n | y_n = i, C_m).
   - Update the training dataset T_{m+1} as T_{m+1} = {T_r^m, T_t^{m+1}}.
2. Learn the final classifier using T_{M+1}.

Table 2. Randomization-based iterative procedure for weakly supervised learning (IP2).

4 Application to semi-supervised learning

As a specific case of interest of the proposed weakly supervised strategy, we consider an application to semi-supervised learning. We first briefly review existing approaches and then detail our contribution.

4.1 Related work

Semi-supervised learning is reviewed in [5].
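The iterative procedure of Table 2 can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: a simple nearest-centroid model (the hypothetical CentroidClassifier below) stands in for the soft random forest C_m, and the function names and hyperparameter defaults are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

class CentroidClassifier:
    """Stand-in for the paper's soft random forest: a nearest-centroid
    model whose class likelihoods p(x | y = i) decay with distance."""
    def fit(self, X, priors):
        # priors: (n_samples, n_classes) soft labels pi_n
        self.centroids = np.stack([
            (priors[:, i:i + 1] * X).sum(0) / priors[:, i].sum()
            for i in range(priors.shape[1])
        ])
        return self
    def likelihood(self, X):
        # unnormalized p(x | y = i) for each class centroid
        d = ((X[:, None, :] - self.centroids[None]) ** 2).sum(-1)
        return np.exp(-d)

def ip2(X, pi1, M=10, beta=0.75):
    """Iterative Procedure 2 (Table 2), with the stand-in classifier."""
    pi = pi1.copy()
    n = len(X)
    for m in range(M):
        # random split of T_m into T_r^m (train) and T_t^m (test)
        idx = rng.permutation(n)
        train, test = idx[:int(beta * n)], idx[int(beta * n):]
        C = CentroidClassifier().fit(X[train], pi[train])
        # soft-label update: pi_n^{m+1} propto pi_n^1 * p(x_n | y_n = i, C_m)
        upd = pi1[test] * C.likelihood(X[test])
        pi[test] = upd / upd.sum(1, keepdims=True)
    # final classifier learned on T_{M+1}
    return CentroidClassifier().fit(X, pi)
```

Note how the update multiplies by the initial priors π_n^1 rather than the current ones, so the original weak supervision keeps constraining every iteration while the classifier-derived likelihoods sharpen the labels.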
Four types of methods can be distinguished. The first type includes generative models, often exploiting Expectation-Maximization schemes to estimate the parameters of mono-modal Gaussian models [31] [5] or multi-modal Gaussian models [32]. Their advantage is the consistency of the mathematical framework with the probabilistic setting. The second category refers to discriminative models such as the semi-supervised support vector machine (S3VM) [33] [5]. Despite a mathematically sound basis and good performance, S3VMs are subject to local optimization issues and can be outperformed by other models depending on the dataset. Graph-based classifiers are another well-known category in semi-supervised learning [34] [5]. The approach is close to the K-nearest-neighbour approach, but similarities between examples are also taken into account. The principal drawback is that such classifiers are mostly transductive: generalization properties are rather weak, and performance decreases on unobserved data. The last family of semi-supervised models is formed by iterative schemes such as the self-training approach [35] or the co-training approach [36], the latter being applicable when observation features can be split into two independent groups. Their advantages are the good performance reached by these methods and the simplicity of the approach; their drawbacks mostly lie in the difficulty of characterizing convergence properties.
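The self-training scheme mentioned above can be sketched as a short loop: train on the labeled pool, pseudo-label the most confident unlabeled points, move them into the labeled pool, and repeat. The sketch below is a generic illustration, not the method of [35]; the nearest-centroid stand-in classifier, the confidence threshold, and the function name are our assumptions.

```python
import numpy as np

def self_training(X_lab, y_lab, X_unl, n_classes=2, conf=0.9, max_iter=10):
    """Generic self-training loop with a nearest-centroid stand-in model."""
    X_lab, y_lab = X_lab.copy(), y_lab.copy()
    pool = X_unl.copy()
    for _ in range(max_iter):
        if len(pool) == 0:
            break
        # train: one centroid per class from the current labeled pool
        cents = np.stack([X_lab[y_lab == c].mean(0) for c in range(n_classes)])
        # predict: soft class posteriors from squared distances
        d = ((pool[:, None] - cents[None]) ** 2).sum(-1)
        p = np.exp(-d)
        p /= p.sum(1, keepdims=True)
        # pseudo-label only the confident points and move them over
        confident = p.max(1) >= conf
        if not confident.any():
            break
        X_lab = np.concatenate([X_lab, pool[confident]])
        y_lab = np.concatenate([y_lab, p[confident].argmax(1)])
        pool = pool[~confident]
    return X_lab, y_lab
```

The convergence difficulty noted above is visible even in this sketch: an early wrong pseudo-label shifts the centroids and can then be reinforced in later iterations.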
