Bagging, Boosting and Ransac - LASA
MACHINE LEARNING - 2013

Relaxing the assumptions

We DROP the second of the two assumptions made so far:

- Y(1), ..., Y(m) are i.i.d.
- E(Y) = y (Y is an unbiased estimator of y)

We add and subtract E(Y), then regroup the terms:

  E[(Y - y)^2] = E[(Y - E(Y) + E(Y) - y)^2]
               = E[((Y - E(Y)) + (E(Y) - y))^2]
               = E[(Y - E(Y))^2] + E[(E(Y) - y)^2] + E[2(Y - E(Y))(E(Y) - y)]

The cross term vanishes, because E(Y) - y is a constant:

  E[2(Y - E(Y))(E(Y) - y)] = 2(E(Y) - y) E[Y - E(Y)] = 0

Hence

  E[(Y - y)^2] = sigma^2(Y) + E[(E(Y) - y)^2] >= E[(E(Y) - y)^2]

Writing Z = E(Y) for the aggregated predictor, this gives

  E[(Y - y)^2] >= E[(Z - y)^2]

so using Z gives us a smaller error (even if we can't prove convergence to zero). Since the gap between the two errors is sigma^2(Y) >= 0, the larger the variance of Y is, the better aggregation works for us.
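The inequality above can be checked empirically. The sketch below is not from the slides: it simulates m i.i.d. unbiased predictors Y(i) = y + noise and approximates the aggregated predictor Z = E(Y) by their sample mean, then compares the Monte Carlo estimates of E[(Y - y)^2] and E[(Z - y)^2]. The values of y, m, sigma, and the number of trials are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

y = 2.0           # true value to estimate (assumed for the demo)
m = 25            # number of i.i.d. predictors (assumed)
trials = 100_000  # Monte Carlo repetitions

# Each Y(i) = y + noise: unbiased, E[Y] = y, with variance sigma^2 = 1.
Y = y + rng.normal(0.0, 1.0, size=(trials, m))

# Z approximates E(Y) by averaging the m predictors (as in bagging).
Z = Y.mean(axis=1)

mse_single = np.mean((Y[:, 0] - y) ** 2)  # E[(Y - y)^2], approx sigma^2
mse_agg = np.mean((Z - y) ** 2)           # E[(Z - y)^2], approx sigma^2 / m

print(f"single predictor MSE: {mse_single:.3f}")
print(f"aggregated MSE:       {mse_agg:.3f}")
```

With these settings the aggregated error is roughly sigma^2 / m of the single-predictor error, which is the gap sigma^2(Y) the derivation identifies: the larger the variance of the individual predictors, the more aggregation helps.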