Learning binary relations using weighted majority voting
LEARNING BINARY RELATIONS 267
[Figure 6 plot: upper bound on mistakes (y-axis, 200,000 to 325,000) versus $\alpha$ (x-axis, 0 to 10,000).]
Figure 6. This figure provides a graphical example for the mistake bound of Theorem 3. We have plotted $\alpha = \alpha_p$ on the x-axis, and the value of our upper bound on the mistakes on the y-axis. In this plot $m = n = 1000$ and $k = k_p = 50$. Observe that there are 1,000,000 predictions made, and when there is no noise (i.e. $\alpha = 0$) our upper bound on the mistakes is 180,121. Note that this is only an upper bound on the mistakes made by Learn-Relation (over the 1,000,000 predictions); the actual number of mistakes could be lower.
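The no-noise value quoted in the caption can be reproduced numerically. A minimal sketch, assuming the $\alpha = 0$ bound takes the form $k_p m + \sqrt{3mn^2 \lg k_p}$ (our reading of the bound manipulated in the proof below; $\lg$ denotes $\log_2$):

```python
import math

# Parameters used in Figure 6.
m = n = 1000
k_p = 50

# Assumed form of the noise-free (alpha = 0) upper bound:
#   k_p * m + sqrt(3 * m * n^2 * lg(k_p)),  with lg = log base 2.
bound = k_p * m + math.sqrt(3 * m * n * n * math.log2(k_p))
print(round(bound))  # matches the 180,121 quoted in the caption
```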
Before proving the theorem, we look at one special case. If partition $p$ is such that $\alpha_p = n$, then the number of mistakes is at most
$$k_p m + \sqrt{3mn^2 \lg k + 4mn^2 + 2mn^2\sqrt{6 \lg k}}.$$
Proof of Theorem 3: From Theorem 2 we know that for all $\beta \in [0,1)$, our algorithm makes at most
$$\min_p \left\{ k_p m + \sqrt{\frac{3mn^2 \lg k_p + 2\alpha_p(mn - \alpha_p)\lg\frac{1}{\beta}}{\lg\frac{2}{1+\beta}}} \right\}$$
mistakes, where the minimum is taken over all partitions $p$, and $k_p$ denotes the size and $\alpha_p$ the noise of partition $p$.
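The trade-off in $\beta$ behind this bound can be illustrated numerically. The sketch below assumes the $\beta$-dependent bound has the form $k_p m + \sqrt{(3mn^2 \lg k_p + 2\alpha_p(mn-\alpha_p)\lg\frac{1}{\beta}) / \lg\frac{2}{1+\beta}}$ (our reconstruction from the garbled statement), and sweeps $\beta$ for the parameters of Figure 6 with some noise:

```python
import math

# Parameters as in Figure 6, with some noise alpha_p > 0 (illustrative choice).
m = n = 1000
k_p = 50
alpha_p = 5000
mn = m * n

def bound(beta):
    """Assumed Theorem 2 form: k_p*m + sqrt((3*m*n^2*lg(k_p)
    + 2*alpha_p*(mn - alpha_p)*lg(1/beta)) / lg(2/(1+beta)))."""
    num = 3 * m * n * n * math.log2(k_p) \
        + 2 * alpha_p * (mn - alpha_p) * math.log2(1 / beta)
    den = math.log2(2 / (1 + beta))
    return k_p * m + math.sqrt(num / den)

# Small beta over-penalizes the noise term; beta near 1 blows up the
# denominator.  The best bound is attained at an interior beta, which is
# why Theorem 3 tunes beta.
betas = [i / 100 for i in range(1, 100)]
best_beta = min(betas, key=bound)
```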
Assume that the partition $p$ has the property that $k_p \le k$ and $\alpha_p \le \alpha$. Observe that $\left(\ln\frac{1}{\beta}\right) \big/ \left(2\ln\frac{2}{1+\beta}\right) \ge 1$ for $\beta \in [0,1)$. Furthermore, since $2\alpha_p(mn - \alpha_p) \le 2\alpha(mn - \alpha)$ for $\alpha_p \le \alpha \le \frac{mn}{2}$, it follows that
$$\frac{3mn^2 \lg k + 2\alpha_p(mn - \alpha_p)\lg\frac{1}{\beta}}{\lg\frac{2}{1+\beta}}$$
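The two observations used here, the inequality $\ln\frac{1}{\beta} \ge 2\ln\frac{2}{1+\beta}$ for $\beta \in [0,1)$ and the monotonicity of $x \mapsto 2x(mn-x)$ on $[0, \frac{mn}{2}]$, are easy to spot-check numerically. The following sketch (an illustration, not part of the original proof) verifies both on a grid:

```python
import math

# Check ln(1/b) / (2 ln(2/(1+b))) >= 1 on a grid of beta in (0, 1).
min_ratio = math.inf
for i in range(1, 1000):
    b = i / 1000
    ratio = math.log(1 / b) / (2 * math.log(2 / (1 + b)))
    min_ratio = min(min_ratio, ratio)
    assert ratio >= 1

# Check that x -> 2x(mn - x) is nondecreasing on [0, mn/2], so that
# alpha_p <= alpha <= mn/2 implies 2*alpha_p*(mn-alpha_p) <= 2*alpha*(mn-alpha).
mn = 1_000_000
prev = -1.0
for x in range(0, mn // 2 + 1, 1000):
    cur = 2 * x * (mn - x)
    assert cur >= prev
    prev = cur
```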