Improved Classification Rates for Localized Algorithms under Margin Conditions (e-book)
436,85 DKK
(incl. VAT 546,06 DKK)
Publisher: Springer Spektrum
Published: 18 March 2020
Genres: Probability and statistics
Language: English
Format: PDF
Protection: LCP
ISBN: 9783658295912
Support vector machines (SVMs) are among the most successful algorithms on small and medium-sized data sets, but on large-scale data sets their training and prediction become computationally infeasible. The author considers a spatially defined data chunking method for large-scale learning problems, leading to so-called localized SVMs, and carries out an in-depth mathematical analysis with theoretical guarantees, which in particular include classification rates. The statistical analysis relies on a new and simple partitioning-based technique and takes into account well-known margin conditions that describe the behavior of the data-generating distribution. The resulting rates turn out to outperform the known rates of several other learning algorithms under suitable sets of assumptions. From a practical point of view, the author shows that a common training-and-validation procedure achieves the theoretical rates adaptively, that is, without knowing the margin parameters in advance.
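The description outlines the spatial chunking idea only in words. As a rough illustration of it, here is a minimal Python sketch of a localized SVM: the choice of k-means as the partitioning rule, the use of scikit-learn's KMeans and SVC, and all names and parameters below are assumptions made for this sketch, not the construction analyzed in the book.

```python
# Minimal sketch of a localized SVM under the assumptions stated above:
# the input space is chunked with k-means, and one small SVM is trained
# per cell, so no single training problem sees the full data set.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

class LocalizedSVM:
    def __init__(self, n_cells=10, **svc_params):
        self.partition = KMeans(n_clusters=n_cells, n_init=10)
        self.svc_params = svc_params
        self.models = {}

    def fit(self, X, y):
        # Assign every training point to a spatial cell.
        cells = self.partition.fit_predict(X)
        for c in np.unique(cells):
            mask = cells == c
            if np.unique(y[mask]).size < 2:
                # Degenerate cell containing a single class: store the label.
                self.models[c] = y[mask][0]
            else:
                # Each local SVM sees only the points falling into its cell,
                # which keeps the individual training problems small.
                self.models[c] = SVC(**self.svc_params).fit(X[mask], y[mask])
        return self

    def predict(self, X):
        # Route each test point to its cell and apply that cell's local model.
        cells = self.partition.predict(X)
        y_pred = np.empty(len(X), dtype=object)
        for c in np.unique(cells):
            mask = cells == c
            model = self.models[c]
            y_pred[mask] = (model.predict(X[mask])
                            if hasattr(model, "predict") else model)
        return y_pred
```

In line with the training-and-validation procedure mentioned in the description, the local hyperparameters (such as SVC's C and the kernel bandwidth) would in practice be selected on a hold-out split rather than fixed in advance, which is what makes the procedure adaptive to the unknown margin parameters.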