7.2. Segmentation and Tagging of News shows

feature name                              weight (ranking) in run 1   weight (ranking) in run 2
frame:b and w:CenterOfMass:normalizedY    1.0   (1)                   1.0   (1)
frame:red:standardDeviation               0.830 (2)                   0.801 (3)
frame:blue:median                         0.804 (3)                   -
frame:green:average                       0.789 (4)                   -
frame:red:average                         0.764 (5)                   -
frame:blue:average                        -                           0.837 (2)
frame:green:CenterOfMass:normalizedX      -                           0.729 (4)
frame:green:average                       -                           0.717 (5)

Table 7.4.: Normalized feature weights of a linear SVM for distinguishing between anchorshots and news report shots. The weights are calculated using the Weights by SVM operator included in RapidMiner. The calculation is based on two different randomly drawn stratified data sets, each containing 10,000 examples.
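
The weighting itself is done inside a RapidMiner process; purely for illustration, a roughly equivalent computation could be sketched in Python with scikit-learn. The names X, y, and feature_names as well as the LinearSVC settings are assumptions for this sketch, not part of the thesis pipeline.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

def svm_feature_weights(X, y, feature_names, sample_size=10_000, seed=0):
    # Draw one stratified random sample of the labeled shots, as for Table 7.4.
    X_s, _, y_s, _ = train_test_split(
        X, y, train_size=sample_size, stratify=y, random_state=seed)
    # Fit a linear SVM; the absolute coefficients serve as feature weights.
    svm = LinearSVC(C=1.0, max_iter=10_000)
    svm.fit(X_s, y_s)
    weights = np.abs(svm.coef_).ravel()
    weights = weights / weights.max()          # normalize: top feature -> 1.0
    order = np.argsort(-weights)               # feature indices, best first
    return [(feature_names[i], round(float(weights[i]), 3), rank + 1)
            for rank, i in enumerate(order)]
```

Running this twice with different seeds corresponds to the two runs in Table 7.4: each call draws its own stratified sample, so the resulting rankings can differ.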

Obviously, the optimal feature subset varies from one run to the other. This difference in the top five ranked features is caused by the sampling of the input data. In order to obtain reliable results, we are interested in a more stable feature selection. Thus we have to find those features that produce good results on different splits of the data. Hence, after reading in the labeled data, we split it into different sets according to the splits of an X-validation. As we will need another X-validation operator, this X-validation will be referred to as the outer X-validation. On each of the training sets produced by the outer X-validation, we select the optimal features using an evolutionary algorithm. This evolutionary algorithm in turn uses an (inner) X-validation to evaluate the candidate feature subsets. Based on the optimal feature subset, a model is built and evaluated for each split of the original data. Finally, the outer X-validation determines which features have been selected for most splits of the data. The corresponding process is shown in Figure 7.5.

Figure 7.5.: RapidMiner process for the feature selection using Naive Bayes.
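
The process in Figure 7.5 is built from RapidMiner operators; as a rough, non-authoritative sketch of the same idea, the following Python/scikit-learn code nests an inner feature search inside an outer X-validation and counts how often each feature is selected. A simple greedy forward search stands in here for RapidMiner's evolutionary feature selection, and X, y, and feature_names are assumed inputs.

```python
from collections import Counter
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.naive_bayes import GaussianNB

def select_features_inner(X, y, n_folds=10):
    """Inner X-validation: greedily add the feature that improves accuracy most
    (a stand-in for the evolutionary feature selection used in the thesis)."""
    selected, remaining, best_score = [], list(range(X.shape[1])), 0.0
    improved = True
    while improved and remaining:
        improved = False
        for f in list(remaining):
            cols = selected + [f]
            score = cross_val_score(GaussianNB(), X[:, cols], y,
                                    cv=StratifiedKFold(n_folds)).mean()
            if score > best_score:
                best_score, best_f, improved = score, f, True
        if improved:
            selected.append(best_f)
            remaining.remove(best_f)
    return selected

def stable_feature_selection(X, y, feature_names, n_outer=10):
    """Outer X-validation: count how often each feature is selected per split."""
    counts = Counter()
    outer = StratifiedKFold(n_outer, shuffle=True, random_state=0)
    for train_idx, test_idx in outer.split(X, y):
        chosen = select_features_inner(X[train_idx], y[train_idx])
        # Build and evaluate a Naive Bayes model on the held-out split,
        # mirroring the per-fold evaluation in the RapidMiner process.
        model = GaussianNB().fit(X[train_idx][:, chosen], y[train_idx])
        model.score(X[test_idx][:, chosen], y[test_idx])
        counts.update(feature_names[i] for i in chosen)
    return counts.most_common()   # features selected in most splits come first
```

Features that appear in the selection of most outer splits would then be taken as the stable subset described above.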

