
Dimensionality Reduction

n_features_to_select  support_                                                    ranking_
4                     [False True False True False False False False True True]   [3 1 2 1 7 4 6 5 1 1]
5                     [False True True True False False False False True True]    [2 1 1 1 6 3 5 4 1 1]
6                     [ True True True True False False False False True True]    [1 1 1 1 5 2 4 3 1 1]
7                     [ True True True True False True False False True True]     [1 1 1 1 4 1 3 2 1 1]
8                     [ True True True True False True False True True True]      [1 1 1 1 3 1 2 1 1 1]
9                     [ True True True True False True True True True True]       [1 1 1 1 2 1 1 1 1 1]
10                    [ True True True True True True True True True True]        [1 1 1 1 1 1 1 1 1 1]

We see that the result is very stable. Features that were selected when requesting smaller feature sets keep being selected as we let more features in. Finally, we rely on our train/test split to warn us when we are going in the wrong direction.
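A table like the one above can be reproduced with scikit-learn's `RFE` class. Note that the estimator and data used here (a `LinearRegression` fit on a synthetic `make_regression` dataset) are assumptions chosen for illustration; they are not necessarily what produced the exact output shown above.

```python
# Sketch of a recursive feature elimination (RFE) loop, printing
# support_ and ranking_ for increasing numbers of selected features.
# The estimator and synthetic data are illustrative assumptions.
from sklearn.datasets import make_regression
from sklearn.feature_selection import RFE
from sklearn.linear_model import LinearRegression

X, y = make_regression(n_samples=100, n_features=10,
                       n_informative=3, random_state=0)

for n in range(4, 11):
    selector = RFE(LinearRegression(), n_features_to_select=n)
    selector.fit(X, y)
    # support_ marks the kept features; ranking_ is 1 for kept
    # features and grows with how early a feature was eliminated.
    print(n, selector.support_, selector.ranking_)
```

Because RFE drops one feature at a time, the selections are nested by construction, which is why the stability seen above is expected rather than lucky.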

Other feature selection methods

There are several other feature selection methods that you will discover while reading through the machine learning literature. Some don't even look like feature selection methods, as they are embedded in the learning process itself (not to be confused with the previously mentioned wrappers). Decision trees, for instance, have a feature selection mechanism built deep into their core. Other learning methods employ some kind of regularization that punishes model complexity, thus driving the learning process towards models that perform well and are still "simple". They do this by shrinking the weights of the less impactful features to zero and then dropping them (L1 regularization).
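Both kinds of embedded selection can be observed directly in scikit-learn. The dataset below is an assumed synthetic example, not from the book: a fitted decision tree exposes `feature_importances_`, and Lasso (L1-regularized linear regression) drives the coefficients of uninformative features to exactly zero.

```python
# Sketch of embedded feature selection (assumed synthetic data):
# - a decision tree ranks features via feature_importances_
# - Lasso (L1) zeroes out coefficients of weak features
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=200, n_features=10,
                       n_informative=3, noise=1.0, random_state=0)

tree = DecisionTreeRegressor(random_state=0).fit(X, y)
print("tree importances:", tree.feature_importances_.round(2))

lasso = Lasso(alpha=1.0).fit(X, y)
# Features whose coefficient survived the L1 penalty:
print("L1 keeps features:", np.flatnonzero(lasso.coef_))
```

The indices reported by the Lasso are the features the model effectively selected; everything else was dropped as part of fitting, with no separate selection step.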

So watch out! Often, the power of machine learning methods must to a great degree be attributed to their embedded feature selection method.

