Kernlab kraziness: inconsistent results for identical problems

I stumbled on some puzzling behaviour in the kernlab package: evaluating SVMs that are mathematically identical produces different results in the software.

This piece of code simply takes the iris data and turns it into a binary classification task for simplicity. As you can see, I use linear kernels in both SVMs.

 library(kernlab)
 library(e1071)
 data(iris)
 x <- as.matrix(iris[, 1:4])
 y <- as.factor(ifelse(iris[, 5] == 'versicolor', 1, -1))
 C <- 5.278031643091578
 svm1 <- ksvm(x = x, y = y, scaled = FALSE, kernel = 'vanilladot', C = C)
 K <- kernelMatrix(vanilladot(), x)
 svm2 <- ksvm(x = K, y = y, C = C, kernel = 'matrix')
 svm3 <- svm(x = x, y = y, scale = FALSE, kernel = 'linear', cost = C)

However, the summary information for svm1 and svm2 differs sharply: kernlab reports completely different numbers of support vectors, training error rates, and objective function values for the two models.

 > svm1
 Support Vector Machine object of class "ksvm"

 SV type: C-svc  (classification)
  parameter : cost C = 5.27803164309158

 Linear (vanilla) kernel function.

 Number of Support Vectors : 89

 Objective Function Value : -445.7911
 Training error : 0.26

 > svm2
 Support Vector Machine object of class "ksvm"

 SV type: C-svc  (classification)
  parameter : cost C = 5.27803164309158

 [1] " Kernel matrix used as input."

 Number of Support Vectors : 59

 Objective Function Value : -292.692
 Training error : 0.333333

For comparison, I also fit the same model using e1071, which provides an R interface to libsvm.

 > svm3

 Call:
 svm.default(x = x, y = y, scale = FALSE, kernel = "linear", cost = C)

 Parameters:
    SVM-Type:  C-classification
  SVM-Kernel:  linear
        cost:  5.278032
       gamma:  0.25

 Number of Support Vectors:  89

It reports 89 support vectors, the same as svm1.
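As a quick consistency check, one can also compare the predictions of the kernlab and e1071 fits directly. This is a sketch that assumes the objects from the code above are still in the workspace:

```r
# Sketch: compare kernlab (svm1) and e1071 (svm3) predictions on the
# training data; assumes svm1, svm3, and x exist as defined above.
p1 <- predict(svm1, x)   # kernlab predictions
p3 <- predict(svm3, x)   # e1071 predictions

# Fraction of points on which the two models agree; for mathematically
# identical models this should be 1 (or very close to it).
mean(as.character(p1) == as.character(p3))
```

Since both report 89 support vectors, their predictions would be expected to coincide; it is svm2, the precomputed-matrix fit, that is the odd one out.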

My question is: are there any known bugs in the kernlab package that could account for this unusual behavior?

(kernlab is an SVM solver for R that lets you use one of several pre-packaged kernel functions or a user-supplied kernel matrix. The output is the fitted support vector machine for user-provided hyperparameters.)
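For readers unfamiliar with kernlab's two input modes, here is a minimal sketch of both, loosely following the `?ksvm` documentation; note in particular that predicting from a model fit with `kernel = 'matrix'` requires the test-versus-support-vector kernel values, selected with `SVindex()`:

```r
# Sketch of kernlab's two ways of specifying a kernel (assumes kernlab
# is installed; C = 1 is an arbitrary illustrative value).
library(kernlab)
data(iris)
x <- as.matrix(iris[, 1:4])
y <- as.factor(ifelse(iris[, 5] == 'versicolor', 1, -1))

# Mode 1: raw data plus a built-in kernel name.
m1 <- ksvm(x, y, kernel = 'vanilladot', scaled = FALSE, C = 1)

# Mode 2: a precomputed Gram matrix with kernel = 'matrix'.
K  <- kernelMatrix(vanilladot(), x)
m2 <- ksvm(as.kernelMatrix(K), y, kernel = 'matrix', C = 1)

# Prediction for the 'matrix' model: pass kernel values between the
# test points (rows) and the model's support vectors (columns).
Ktest <- as.kernelMatrix(K[, SVindex(m2), drop = FALSE])
preds <- predict(m2, Ktest)
```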

r machine-learning svm kernlab
Nov 19 '15 at 20:41
1 answer

After reviewing part of the code, it turns out that this is the offending line:

https://github.com/cran/kernlab/blob/efd7d91521b439a993efb49cf8e71b57fae5fc5a/src/svm.cpp#L4205

That is, in the case of a user-supplied kernel matrix, ksvm only looks at the first two dimensions rather than the full input dimension. This looks like a bug, probably a leftover from testing or debugging. Experiments with the linear kernel on data with only two features bear this out: replace 1:4 with 1:2 in the code above, and the output and predictions of all three models agree.
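The claim is easy to check. This sketch reruns the question's setup restricted to two input columns, using kernlab's `nSV()` and `error()` accessors to compare the fits:

```r
# Sketch verifying that with only two input columns the 'vanilladot'
# and precomputed-'matrix' fits agree (setup and C value taken from
# the question).
library(kernlab)
data(iris)
x2 <- as.matrix(iris[, 1:2])   # only the first two dimensions
y  <- as.factor(ifelse(iris[, 5] == 'versicolor', 1, -1))
C  <- 5.278031643091578

s1 <- ksvm(x = x2, y = y, scaled = FALSE, kernel = 'vanilladot', C = C)
K2 <- kernelMatrix(vanilladot(), x2)
s2 <- ksvm(x = K2, y = y, C = C, kernel = 'matrix')

# Support-vector counts and training errors should now match:
c(nSV(s1), nSV(s2))
c(error(s1), error(s2))
```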

Dec 01 '15 at 18:00
