For example, to create a square exponential kernel:
```julia
k = SqExponentialKernel()
```
Instead of having lengthscale(s) for each kernel, we use `Transform` objects (see [Transform](@ref)), which act directly on the inputs before they are passed to the kernel.
For example, to premultiply the input by 2.0, the following options are possible:
```julia
k = transform(SqExponentialKernel(),ScaleTransform(2.0)) # returns a TransformedKernel
k = @kernel SqExponentialKernel() l=2.0 # Will be available soon
k = TransformedKernel(SqExponentialKernel(),ScaleTransform(2.0))
```
Check the [`Transform`](@ref) page to see the other options.
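For instance, per-dimension lengthscales can be handled with a vector-valued transform. The snippet below is only a sketch: it assumes an `ARDTransform` taking one scaling factor per input dimension is among those options, and the values are arbitrary.
```julia
k = transform(SqExponentialKernel(), ARDTransform([0.5, 1.0, 2.0])) # one scaling per input dimension
```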
To premultiply the kernel by a variance, you can use `*` or create a `ScaledKernel`:
```julia
k = 3.0*SqExponentialKernel()
k = ScaledKernel(SqExponentialKernel(),3.0)
@kernel 3.0*SqExponentialKernel()
```

## Using a kernel function

To compute the kernel function on two vectors you can call:
```julia
k = SqExponentialKernel()
x1 = rand(3); x2 = rand(3)
kappa(k,x1,x2) == k(x1,x2) # Syntactic sugar
```

## Creating a kernel matrix

Kernel matrices can be created via the `kernelmatrix` function, or `kerneldiagmatrix` for only the diagonal.
An important argument is the dimensionality of the input, `obsdim`: it specifies whether the matrix layout is `# samples X # features` (`obsdim=1`) or `# features X # samples` (`obsdim=2`), similarly to [Distances.jl](https://github.com/JuliaStats/Distances.jl).
For example:
```julia
k = SqExponentialKernel()
A = rand(10,5)
kernelmatrix(k,A,obsdim=1) # Return a 10x10 matrix
kernelmatrix(k,A,obsdim=2) # Return a 5x5 matrix
k(A,obsdim=1) # Syntactic sugar
```
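The diagonal alone can be computed without building the full matrix. A brief sketch, assuming `kerneldiagmatrix` accepts the same `obsdim` keyword as `kernelmatrix`:
```julia
kerneldiagmatrix(k,A,obsdim=1) # Vector of the 10 diagonal entries
```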
We also support specific kernel matrix outputs:
- For a positive-definite matrix object `PDMat` from [`PDMats.jl`](https://github.com/JuliaStats/PDMats.jl), you can call the following:
```julia
using PDMats
k = SqExponentialKernel()
K = kernelpdmat(k,A,obsdim=1) # PDMat
```
It will compute the kernel matrix and, in case of bad conditioning, add diagonal noise until the matrix is considered PSD; it then returns a `PDMat` object. For this method to work in your code you need to include `using PDMats` first.
- For a Kronecker matrix, we rely on [`Kronecker.jl`](https://github.com/MichielStock/Kronecker.jl). Here are two examples:
```julia
using Kronecker
x = range(0,1,length=10)
y = range(0,1,length=50)
K = kernelkronmat(k,[x,y]) # Kronecker matrix
K = kernelkronmat(k,x,5) # Kronecker matrix
```
Make sure that `k` is a kernel compatible with such constructions (check with `iskroncompatible`). Both methods will return a `KroneckerProduct`. For those methods to work in your code you need to include `using Kronecker` first.
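As a small sketch of that compatibility check (it simply returns a `Bool`):
```julia
iskroncompatible(SqExponentialKernel()) # true if the kernel supports Kronecker constructions
```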
- For a Nyström approximation: `kernelmatrix(nystrom(k, X, ρ, obsdim = 1))`, where `ρ` is the proportion of samples used.
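For instance, a brief sketch based on the call above (the data matrix `X` and the 10% sampling proportion are arbitrary choices):
```julia
X = rand(100, 5)
K_approx = kernelmatrix(nystrom(k, X, 0.1, obsdim = 1)) # approximate 100x100 kernel matrix
```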
## Composite kernels

One can create combinations of kernels via `KernelSum` and `KernelProduct`, or using simple operators `+` and `*`.
For example:
```julia
k1 = SqExponentialKernel()
k2 = Matern32Kernel()
k = 0.5*k1 + 0.2*k2 # KernelSum
k = k1*k2 # KernelProduct
```

## Kernel Parameters

What if you want to differentiate through the kernel parameters? Even in a highly nested structure such as:
```julia
k = transform(0.5*SqExponentialKernel()*MaternKernel() + 0.2*(transform(LinearKernel(),2.0) + PolynomialKernel()), [0.1,0.5])
```
One can get the array of parameters to optimize via `params` from `Flux.jl`:
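A minimal sketch of how this might look, assuming the nested kernel's parameters are picked up by `Flux.params` (the data matrix `A` and the summed-kernel-matrix loss are arbitrary choices for illustration):
```julia
using Flux

A = rand(10, 2)
ps = Flux.params(k) # collection of the trainable parameter arrays of the nested kernel
# Differentiate a simple loss through the kernel matrix with respect to those parameters
gs = Flux.gradient(() -> sum(kernelmatrix(k, A, obsdim=1)), ps)
```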