Closed-form Gaussian spread estimation for small- and large-scale support vector classification
The support vector machine with Gaussian kernel often achieves state-of-the-art performance in classification problems, but requires tuning of the kernel spread. Most optimization methods for spread tuning require training, which makes them slow and ill-suited to large-scale datasets. We formulate an analytic expression to calculate, directly from data and without iterative search, the spread that minimizes the difference between the Gaussian and ideal kernel matrices. The proposed direct gamma tuning matches the performance of state-of-the-art approaches on 30 small datasets while running one to two orders of magnitude faster. Combined with random sampling of the training patterns, it also scales to large classification problems. Our method is very efficient in experiments on 20 large datasets of up to 31 million patterns: it is faster and performs significantly better than the linear support vector machine, and it is also faster than iterative minimization. Code is available upon paper acceptance from http://persoal.citius.usc.es/manuel.fernandez.delgado/papers/dgt/index.html and from CodeOcean: https://codeocean.com/capsule/4271163/tree/v1.
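To illustrate the underlying idea (the abstract does not reproduce the paper's analytic expression), the sketch below builds an ideal kernel matrix from the class labels (1 for same-class pairs, 0 otherwise) and selects the Gaussian spread gamma that minimizes the Frobenius distance between the Gaussian and ideal kernel matrices. A bounded one-dimensional numeric search stands in for the paper's closed form; the function names (`ideal_kernel`, `direct_gamma`) and the search bounds are illustrative assumptions, not the authors' implementation.

```python
# Sketch of spread selection by matching the Gaussian kernel matrix
# to an ideal, label-based kernel matrix. NOT the paper's closed-form
# expression: a bounded 1-D search is used as a numeric stand-in.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.spatial.distance import pdist, squareform

def ideal_kernel(y):
    """Ideal kernel matrix: 1 where labels match, 0 elsewhere."""
    y = np.asarray(y)
    return (y[:, None] == y[None, :]).astype(float)

def direct_gamma(X, y, bounds=(1e-4, 1e2)):
    """Spread minimizing ||K(gamma) - K_ideal||_F (bounds are assumed)."""
    d2 = squareform(pdist(X, "sqeuclidean"))   # pairwise squared distances
    K_ideal = ideal_kernel(y)

    def objective(gamma):
        K = np.exp(-gamma * d2)                # Gaussian kernel matrix
        return np.linalg.norm(K - K_ideal, "fro")

    res = minimize_scalar(objective, bounds=bounds, method="bounded")
    return res.x

# Usage on a toy two-class dataset; for large datasets, the same idea
# could be applied to a random sample of the training patterns.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print("estimated gamma:", direct_gamma(X, y))
```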
Keywords: Classification, Efficient computing, Large-scale datasets, Model selection, Radial basis kernel, Support vector machine (SVM)