Speaker: Anastasios Giovanidis (LIP6 / CNRS)
Date: 27/05/2020
Time: 11:00 am - 12:00 pm
Location: Paris-Rennes Room (EIT Digital)
Abstract
Cortes, C., Vapnik, V. Support-vector networks. Mach Learn 20, 273–297 (1995). https://doi.org/10.1007/BF00994018.
Abstract of the paper:
“The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimension feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensures high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. We here extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.”
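As a rough illustration of the idea in the abstract (and not part of the talk material), the sketch below trains a soft-margin support-vector classifier with a polynomial kernel on a small, non-linearly-separable two-class dataset. It assumes scikit-learn is available; the dataset, kernel degree, and regularization parameter C are illustrative choices, not values from the paper or the talk.

```python
# Minimal sketch: soft-margin SVM with a polynomial input transformation.
# Assumes scikit-learn; dataset and hyperparameters are illustrative only.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Two-class data that is not linearly separable in the input space.
X, y = make_moons(n_samples=400, noise=0.25, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Polynomial kernel: implicit non-linear map to a higher-dimensional feature
# space where a linear decision surface is fit. C controls the soft margin,
# i.e. how strongly training errors are penalized (the non-separable case).
clf = SVC(kernel="poly", degree=3, C=1.0)
clf.fit(X_train, y_train)

print("test accuracy:", clf.score(X_test, y_test))
print("support vectors per class:", clf.n_support_)
```

Only the support vectors (the points closest to the decision surface) determine the learned classifier, which is the property the paper ties to its generalization guarantees.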
Talk material available here: https://github.com/yokaiAG/DataNets-Course