Data Mining: Practical Machine Learning Tools and ... - LIDeCC
4.6 LINEAR MODELS

Set all weights to zero
Until all instances in the training data are classified correctly
    For each instance I in the training data
        If I is classified incorrectly by the perceptron
            If I belongs to the first class, add it to the weight vector
            Otherwise, subtract it from the weight vector

Figure 4.10 The perceptron: (a) learning rule and (b) representation as a neural network. Panel (b) shows a "bias" input fixed at 1 with weight w0, and attributes a1, a2, a3, ... with weights w1, w2, ..., wk.

Treating the bias as the weight of an extra input whose value is always 1 means that we don't have to include an additional constant element in the sum. If the sum is greater than zero, we predict the first class; otherwise, we predict the second class. We want to find values for the weights so that the training data is correctly classified by the hyperplane.

Figure 4.10(a) gives the perceptron learning rule for finding a separating hyperplane. The algorithm iterates until a perfect solution has been found, but it will work properly only if a separating hyperplane exists, that is, if the data is linearly separable. Each iteration goes through all the training instances. If a misclassified instance is encountered, the parameters of the hyperplane are changed so that the misclassified instance moves closer to the hyperplane, perhaps even across the hyperplane onto the correct side. If the instance belongs to the first class, this is done by adding its attribute values to the weight vector; otherwise, they are subtracted from it.
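The learning rule of Figure 4.10(a) can be sketched directly in code. The following is a minimal illustration, not the book's implementation; the function names, the 0/1 class encoding (0 for the first class), and the epoch cap guarding against non-separable data are assumptions made for this sketch.

```python
def train_perceptron(instances, classes, max_epochs=100):
    """Perceptron learning rule (Figure 4.10a).

    instances: lists of numeric attribute values; a constant 1 is
    prepended to each so weights[0] plays the role of the bias.
    classes: 0 for the first class, 1 for the second.
    """
    # Prepend the "bias" attribute, whose value is always 1.
    data = [[1.0] + list(x) for x in instances]
    weights = [0.0] * len(data[0])  # set all weights to zero

    # Cap the iterations: the loop terminates on its own only if
    # the data is linearly separable.
    for _ in range(max_epochs):
        all_correct = True
        for x, cls in zip(data, classes):
            total = sum(w * v for w, v in zip(weights, x))
            predicted = 0 if total > 0 else 1  # sum > 0 -> first class
            if predicted != cls:
                all_correct = False
                if cls == 0:
                    # First class: add the instance to the weight vector.
                    weights = [w + v for w, v in zip(weights, x)]
                else:
                    # Otherwise subtract it from the weight vector.
                    weights = [w - v for w, v in zip(weights, x)]
        if all_correct:
            break
    return weights

def predict(weights, instance):
    total = sum(w * v for w, v in zip(weights, [1.0] + list(instance)))
    return 0 if total > 0 else 1
```

For example, on the linearly separable one-attribute data `[[2], [3], [-1], [-2]]` with classes `[0, 0, 1, 1]`, training converges in two passes and the returned weights classify all four instances correctly.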
