
Commit 1c71067
Parent: 5b4580e

bugfix & improve writing, replaces #321

File tree: 1 file changed (+17 -17 lines)


examples/support-vector-machine/script.jl

Lines changed: 17 additions & 17 deletions
@@ -2,7 +2,7 @@
 #
 # In this notebook we show how you can use KernelFunctions.jl to generate
 # kernel matrices for classification with a support vector machine, as
-# implemented by LIBSVM.
+# implemented by [LIBSVM](https://github.com/JuliaML/LIBSVM.jl).
 
 using Distributions
 using KernelFunctions
@@ -27,28 +27,28 @@ X1 = [cos.(angle1) sin.(angle1)] .+ 0.1randn(n1, 2)
 X2 = [1 .- cos.(angle2) 1 .- sin.(angle2) .- 0.5] .+ 0.1randn(n2, 2)
 X = [X1; X2]
 x_train = RowVecs(X)
-y_train = vcat(fill(-1, nout), fill(1, nin));
+y_train = vcat(fill(-1, n1), fill(1, n2));
 
-# Create a 100×100 2D grid for evaluation:
-test_range = range(floor(Int, minimum(X)), ceil(Int, maximum(X)); length=100)
-x_test = ColVecs(mapreduce(collect, hcat, Iterators.product(test_range, test_range)));
-
-# ## SVM model
+# ## Training
 #
-# Create kernel function:
+# We create a kernel function:
 k = SqExponentialKernel() ∘ ScaleTransform(1.5)
 
-# [LIBSVM](https://github.com/JuliaML/LIBSVM.jl) can make use of a pre-computed kernel matrix.
-# KernelFunctions.jl can be used to produce that.
-#
-# Precomputed matrix for training
+# LIBSVM can make use of a pre-computed kernel matrix.
+# KernelFunctions.jl can be used to produce that using `kernelmatrix`:
 model = svmtrain(kernelmatrix(k, x_train), y_train; kernel=LIBSVM.Kernel.Precomputed)
 
-# Precomputed matrix for prediction
+# ## Prediction
+#
+# For evaluation, we create a 100×100 2D grid based on the extent of the training data:
+test_range = range(floor(Int, minimum(X)), ceil(Int, maximum(X)); length=100)
+x_test = ColVecs(mapreduce(collect, hcat, Iterators.product(test_range, test_range)));
+
+# Again, we pass the result of KernelFunctions.jl's `kernelmatrix` to LIBSVM:
 y_pred, _ = svmpredict(model, kernelmatrix(k, x_train, x_test));
 
-# Visualize prediction on a grid:
+# We can see that the kernelized, non-linear classification successfully separates the two classes in the training data:
 plot(; xlim=extrema(test_range), ylim=extrema(test_range), aspect_ratio=1)
-contourf!(test_range, test_range, y_pred; levels=1, color=cgrad(:redsblues), alpha=0.7)
-scatter!(X1[:, 1], X1[:, 2]; color=:red, label="class 1")
-scatter!(X2[:, 1], X2[:, 2]; color=:blue, label="class 2")
+contourf!(test_range, test_range, y_pred; levels=1, color=cgrad(:redsblues), alpha=0.7, colorbar_title="prediction")
+scatter!(X1[:, 1], X1[:, 2]; color=:red, label="training data: class 1")
+scatter!(X2[:, 1], X2[:, 2]; color=:blue, label="training data: class 2")
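The core pattern the diff documents is that LIBSVM's precomputed-kernel mode takes the square Gram matrix of the training inputs at training time, and the cross-kernel matrix between training and new inputs at prediction time. A minimal standalone sketch of that workflow, assuming KernelFunctions.jl and LIBSVM.jl as used in the script; the data sizes here are illustrative assumptions, not taken from the commit:

```julia
using KernelFunctions
using LIBSVM

# Two noisy interleaved half-circles, as in the example script.
n1, n2 = 40, 40
angle1 = range(0, π; length=n1)
angle2 = range(0, π; length=n2)
X1 = [cos.(angle1) sin.(angle1)] .+ 0.1randn(n1, 2)
X2 = [1 .- cos.(angle2) 1 .- sin.(angle2) .- 0.5] .+ 0.1randn(n2, 2)
x_train = RowVecs([X1; X2])                # observations stored as matrix rows
y_train = vcat(fill(-1, n1), fill(1, n2))

# Squared-exponential kernel composed with an input scaling.
k = SqExponentialKernel() ∘ ScaleTransform(1.5)

# Training: LIBSVM receives the (n1+n2)×(n1+n2) Gram matrix instead of raw features.
K_train = kernelmatrix(k, x_train)
model = svmtrain(K_train, y_train; kernel=LIBSVM.Kernel.Precomputed)

# Prediction: the cross-kernel matrix has training inputs along the first
# argument and the inputs to classify along the second. Here we simply
# re-classify the training set.
K_cross = kernelmatrix(k, x_train, x_train)
y_pred, _ = svmpredict(model, K_cross)
```

The same `K_cross` construction is what the script uses for its evaluation grid, with `x_test` as the second argument to `kernelmatrix`.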
