
A Tutorial on Support Vector Machines for Pattern Recognition

References

Edgar Osuna and Federico Girosi. Reducing the run-time complexity of support vector machines. In International Conference on Pattern Recognition (submitted), 1998.
William H. Press, Brian P. Flannery, Saul A. Teukolsky, and William T. Vetterling. Numerical Recipes in C: The Art of Scientific Computing. Cambridge University Press, 2nd edition, 1992.
M. Schmidt. Identifying speaker with support vector networks. In Interface '96 Proceedings, Sydney, 1996.
B. Schölkopf. Support Vector Learning. R. Oldenbourg Verlag, Munich, 1997.
B. Schölkopf, C. Burges, and V. Vapnik. Extracting support data for a given task. In U. M. Fayyad and R. Uthurusamy, editors, Proceedings, First International Conference on Knowledge Discovery & Data Mining. AAAI Press, Menlo Park, CA, 1995.
B. Schölkopf, C. Burges, and V. Vapnik. Incorporating invariances in support vector learning machines. In C. von der Malsburg, W. von Seelen, J. C. Vorbrüggen, and B. Sendhoff, editors, Artificial Neural Networks - ICANN'96, pages 47-52, Berlin, 1996. Springer Lecture Notes in Computer Science, Vol. 1112.
B. Schölkopf, P. Simard, A. Smola, and V. Vapnik. Prior knowledge in support vector kernels. In M. Jordan, M. Kearns, and S. Solla, editors, Advances in Neural Information Processing Systems 10, Cambridge, MA, 1998. MIT Press. In press.
B. Schölkopf, A. Smola, and K.-R. Müller. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 1998. In press.
B. Schölkopf, A. Smola, K.-R. Müller, C. J. C. Burges, and V. Vapnik. Support vector methods in learning and feature extraction. In Ninth Australian Congress on Neural Networks (to appear), 1998.
B. Schölkopf, K. Sung, C. Burges, F. Girosi, P. Niyogi, T. Poggio, and V. Vapnik. Comparing support vector machines with Gaussian kernels to radial basis function classifiers. IEEE Transactions on Signal Processing, 45:2758-2765, 1997.
John Shawe-Taylor, Peter L. Bartlett, Robert C. Williamson, and Martin Anthony. A framework for structural risk minimization. In Proceedings, 9th Annual Conference on Computational Learning Theory, pages 68-76, 1996.
John Shawe-Taylor, Peter L. Bartlett, Robert C. Williamson, and Martin Anthony. Structural risk minimization over data-dependent hierarchies. Technical report, NeuroCOLT Technical Report NC-TR-96-053, 1996.
A. Smola and B. Schölkopf. On a kernel-based method for pattern recognition, regression, approximation and operator inversion. Algorithmica (to appear), 1998.
A. Smola, B. Schölkopf, and K.-R. Müller. General cost functions for support vector regression. In Ninth Australian Congress on Neural Networks (to appear), 1998.
Alex J. Smola, Bernhard Schölkopf, and Klaus-Robert Müller. The connection between regularization operators and support vector kernels. Neural Networks (to appear), 1998.
M. O. Stitson, A. Gammerman, V. Vapnik, V. Vovk, C. Watkins, and J. Weston. Support vector ANOVA decomposition. Technical report, Royal Holloway College, Report number CSD-TR-97-22, 1997.
Gilbert Strang. Introduction to Applied Mathematics. Wellesley-Cambridge Press, 1986.
R. J. Vanderbei. Interior point methods: Algorithms and formulations. ORSA Journal on Computing, 6(1):32-34, 1994.
R. J. Vanderbei. LOQO: An interior point code for quadratic programming. Technical report, Program in Statistics & Operations Research, Princeton University, 1994.
V. Vapnik. Estimation of Dependences Based on Empirical Data [in Russian]. Nauka, Moscow, 1979. (English translation: Springer Verlag, New York, 1982).
V. Vapnik. The Nature of Statistical Learning Theory. Springer-Verlag, New York, 1995.
V. Vapnik. Statistical Learning Theory. John Wiley and Sons, Inc., New York, in preparation.
V. Vapnik, S. Golowich, and A. Smola. Support vector method for function approximation, regression estimation, and signal processing. Advances in Neural Information Processing Systems, 9:281-287, 1996.
Grace Wahba. Support vector machines, reproducing kernel Hilbert spaces and the GACV. In Proceedings of the 1997 NIPS Workshop on Support Vector Machines (to appear), 1998.
J. Weston, A. Gammerman, M. O. Stitson, V. Vapnik, V. Vovk, and C. Watkins. Density estimation using support vector machines. Technical report, Royal Holloway College, Report number CSD-TR-97-23, 1997.
