Commit 51cb463

Merge branch 'master' into compathelper/new_version/2021-09-20-00-46-08-917-00277351318
2 parents 350a57c + 924925d commit 51cb463


48 files changed, +565 -380 lines changed

.github/workflows/benchmark.yml

Lines changed: 24 additions & 0 deletions
@@ -0,0 +1,24 @@
+name: Run benchmarks
+
+on:
+  pull_request:
+
+jobs:
+  Benchmark:
+    runs-on: ubuntu-latest
+    if: contains(github.event.pull_request.labels.*.name, 'performance critical')
+    env:
+      JULIA_DEBUG: BenchmarkCI
+    steps:
+      - uses: actions/checkout@v2
+      - uses: julia-actions/setup-julia@latest
+        with:
+          version: 1.6
+      - name: Install dependencies
+        run: julia -e 'using Pkg; pkg"add PkgBenchmark BenchmarkCI"'
+      - name: Run benchmarks
+        run: julia -e "using BenchmarkCI; BenchmarkCI.judge()"
+      - name: Post results
+        run: julia -e "using BenchmarkCI; BenchmarkCI.postjudge()"
+        env:
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

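For local runs outside CI, the same suite defined in benchmark/benchmarks.jl can be exercised with PkgBenchmark, the package that BenchmarkCI builds on. This is a minimal sketch based on PkgBenchmark's documented API, not part of the commit:

```julia
# Minimal local-run sketch (assumes PkgBenchmark is installed in the active
# environment); benchmarkpkg loads benchmark/benchmarks.jl and runs SUITE.
using PkgBenchmark

results = benchmarkpkg("KernelFunctions")

# Write a human-readable summary of the timings to the terminal.
export_markdown(stdout, results)
```
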
.github/workflows/ci.yml

Lines changed: 3 additions & 15 deletions
@@ -58,20 +58,8 @@ jobs:
         GROUP: ${{ matrix.group }}
       - uses: julia-actions/julia-processcoverage@v1
         if: matrix.version == '1' && matrix.os == 'ubuntu-latest'
-      - name: Coveralls parallel
+      - name: Send coverage to CodeCov
         if: matrix.version == '1' && matrix.os == 'ubuntu-latest'
-        uses: coverallsapp/github-action@master
+        uses: codecov/codecov-action@v2
         with:
-          github-token: ${{ secrets.GITHUB_TOKEN }}
-          path-to-lcov: ./lcov.info
-          flag-name: run-${{ matrix.group }}
-          parallel: true
-  finish:
-    needs: test
-    runs-on: ubuntu-latest
-    steps:
-      - name: Send coverage
-        uses: coverallsapp/github-action@master
-        with:
-          github-token: ${{ secrets.github_token }}
-          parallel-finished: true
+          file: lcov.info

.gitignore

Lines changed: 1 addition & 0 deletions
@@ -4,3 +4,4 @@
 /test/Manifest.toml
 coverage/
 .DS_store
+benchmark/Manifest.toml

Project.toml

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
 name = "KernelFunctions"
 uuid = "ec8451be-7e33-11e9-00cf-bbf324bd1392"
-version = "0.10.18"
+version = "0.10.27"

 [deps]
 ChainRulesCore = "d360d2e6-b24c-11e9-a2a3-2a2ae2dbcce4"
@@ -30,7 +30,7 @@ Functors = "0.1, 0.2"
 IrrationalConstants = "0.1"
 LogExpFunctions = "0.2.1, 0.3"
 Requires = "1.0.1"
-SpecialFunctions = "0.8, 0.9, 0.10, 1"
+SpecialFunctions = "0.8, 0.9, 0.10, 1, 2"
 StatsBase = "0.32, 0.33"
 TensorCore = "0.1"
 ZygoteRules = "0.2"

README.md

Lines changed: 8 additions & 6 deletions
@@ -1,7 +1,7 @@
 # KernelFunctions.jl

 ![CI](https://github.com/JuliaGaussianProcesses/KernelFunctions.jl/workflows/CI/badge.svg?branch=master)
-[![Coverage Status](https://coveralls.io/repos/github/JuliaGaussianProcesses/KernelFunctions.jl/badge.svg?branch=master)](https://coveralls.io/github/JuliaGaussianProcesses/KernelFunctions.jl?branch=master)
+[![codecov](https://codecov.io/gh/JuliaGaussianProcesses/KernelFunctions.jl/branch/master/graph/badge.svg?token=rmDh3gb7hN)](https://codecov.io/gh/JuliaGaussianProcesses/KernelFunctions.jl)
 [![Documentation (stable)](https://img.shields.io/badge/docs-stable-blue.svg)](https://juliagaussianprocesses.github.io/KernelFunctions.jl/stable)
 [![Documentation (latest)](https://img.shields.io/badge/docs-dev-blue.svg)](https://juliagaussianprocesses.github.io/KernelFunctions.jl/dev)
 [![ColPrac: Contributor's Guide on Collaborative Practices for Community Packages](https://img.shields.io/badge/ColPrac-Contributor's%20Guide-blueviolet)](https://github.com/SciML/ColPrac)
@@ -12,9 +12,11 @@

 ## Kernel functions for machine learning

-KernelFunctions.jl provide a flexible and complete framework for kernel functions, pretransforming the input data.
+KernelFunctions.jl provides a flexible framework for defining kernel functions, and an extensive collection of implementations.

-The aim is to make the API as model-agnostic as possible while still being user-friendly.
+The aim is to make the API as model-agnostic as possible while still being user-friendly, and to interoperate well with generic packages for handling parameters like [ParameterHandling.jl](https://github.com/invenia/ParameterHandling.jl/) and FluxML's [Functors.jl](https://github.com/FluxML/Functors.jl/).
+
+Where appropriate, kernels are AD-compatible.

 ## Examples

@@ -46,12 +48,12 @@ plot(
 <img src="docs/src/assets/heatmap_combination.png" width=400px>
 </p>

-## Packages goals (by priority)
-- Ensure AD Compatibility (already the case for Zygote, ForwardDiff)
-- Toeplitz Matrices compatibility
+## Related Work

 Directly inspired by the [MLKernels](https://github.com/trthatcher/MLKernels.jl) package.

+See the JuliaGaussianProcesses [Github organisation](https://github.com/JuliaGaussianProcesses) and [website](https://juliagaussianprocesses.github.io/) for more related packages.
+
 ## Issues/Contributing

 If you notice a problem or would like to contribute by adding more kernel functions or features please [submit an issue](https://github.com/JuliaGaussianProcesses/KernelFunctions.jl/issues), or open a PR (please see the [ColPrac](https://github.com/SciML/ColPrac) contribution guidelines).

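For orientation, here is a minimal sketch of the API the README describes — constructing a kernel, composing it with an input transform, and evaluating a kernel matrix. The particular kernel and lengthscale are illustrative only:

```julia
using KernelFunctions

# Squared-exponential kernel composed with an input scaling (illustrative value).
k = SqExponentialKernel() ∘ ScaleTransform(0.5)

# 20 two-dimensional inputs, stored as the columns of a matrix.
x = ColVecs(rand(2, 20))

K = kernelmatrix(k, x)        # 20×20 Gram matrix
kd = kernelmatrix_diag(k, x)  # its diagonal, computed without the full matrix
```
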
benchmark/MLKernels.jl

Lines changed: 0 additions & 22 deletions
This file was deleted.

benchmark/Project.toml

Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
+[deps]
+BenchmarkTools = "6e4b80f9-dd63-53aa-95a3-0cdb28fa8baf"
+KernelFunctions = "ec8451be-7e33-11e9-00cf-bbf324bd1392"

benchmark/benchmarks.jl

Lines changed: 39 additions & 16 deletions
@@ -1,23 +1,46 @@
 using BenchmarkTools
-using Random
-using Distances, LinearAlgebra
+using KernelFunctions

-const SUITE = BenchmarkGroup()
+N1 = 10
+N2 = 20

-Random.seed!(1234)
+X = rand(N1, N2)
+Xc = ColVecs(X)
+Xr = RowVecs(X)
+Xv = collect.(eachcol(X))
+Y = rand(N1, N2)
+Yc = ColVecs(Y)
+Yr = RowVecs(Y)
+Yv = collect.(eachcol(Y))

-dim = 50
-N1 = 1000;
-N2 = 500;
-alpha = 2.0
+# Create the general suite of benchmarks
+SUITE = BenchmarkGroup()

-X = rand(Float64, N1, dim)
-Y = rand(Float64, N2, dim)
+kernels = Dict(
+    "SqExponential" => SqExponentialKernel(), "Exponential" => ExponentialKernel()
+)

-KXY = rand(Float64, N1, N2)
-KX = rand(Float64, N1, N1)
-sKX = Symmetric(rand(Float64, N1, N1))
-kX = rand(Float64, N1)
+inputtypes = Dict("ColVecs" => (Xc, Yc), "RowVecs" => (Xr, Yr), "Vecs" => (Xv, Yv))

-include("kernelmatrix.jl")
-include("MLKernels.jl")
+functions = Dict(
+    "kernelmatrixX" => (kernel, X, Y) -> kernelmatrix(kernel, X),
+    "kernelmatrixXY" => (kernel, X, Y) -> kernelmatrix(kernel, X, Y),
+    "kernelmatrix_diagX" => (kernel, X, Y) -> kernelmatrix_diag(kernel, X),
+    "kernelmatrix_diagXY" => (kernel, X, Y) -> kernelmatrix_diag(kernel, X, Y),
+)
+
+for (kname, kernel) in kernels
+    SUITE[kname] = sk = BenchmarkGroup()
+    for (inputname, (X, Y)) in inputtypes
+        sk[inputname] = si = BenchmarkGroup()
+        for (fname, f) in functions
+            si[fname] = @benchmarkable $f($kernel, $X, $Y)
+        end
+    end
+end
+
+# Uncomment the following to run benchmark locally
+
+# tune!(SUITE)
+
+# results = run(SUITE, verbose=true)

benchmark/kernelmatrix.jl

Lines changed: 0 additions & 39 deletions
This file was deleted.

docs/src/api.md

Lines changed: 16 additions & 1 deletion
@@ -84,7 +84,22 @@ To find out more about the background, read this [review of kernels for vector-v

 KernelFunctions also provides miscellaneous utility functions.
 ```@docs
-kernelpdmat
 nystrom
 NystromFact
 ```
+
+## Conditional Utilities
+To keep the dependencies of KernelFunctions lean, some functionality is only available if specific other packages are explicitly loaded (`using`).
+
+### Kronecker.jl
+[*https://github.com/MichielStock/Kronecker.jl*](https://github.com/MichielStock/Kronecker.jl)
+```@docs
+kronecker_kernelmatrix
+kernelkronmat
+```
+
+### PDMats.jl
+[*https://github.com/JuliaStats/PDMats.jl*](https://github.com/JuliaStats/PDMats.jl)
+```@docs
+kernelpdmat
+```

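To make the conditional loading described under "Conditional Utilities" concrete, here is a small sketch: `kernelpdmat` only becomes callable once PDMats.jl has been loaded (the glue code is activated via Requires.jl). The kernel and data below are placeholders:

```julia
using KernelFunctions
using PDMats  # loading PDMats makes kernelpdmat available

k = SqExponentialKernel()
x = ColVecs(rand(3, 10))

# Kernel matrix wrapped as a PDMat; jitter is added to the diagonal if
# necessary to guarantee positive definiteness.
K = kernelpdmat(k, x)
```
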
docs/src/kernels.md

Lines changed: 4 additions & 0 deletions
@@ -27,6 +27,7 @@ CosineKernel

 ```@docs
 ExponentialKernel
+GibbsKernel
 LaplacianKernel
 SqExponentialKernel
 SEKernel
@@ -128,6 +129,9 @@ NormalizedKernel
 ```

 ## Multi-output Kernels
+Kernelfunctions implements multi-output kernels as scalar kernels on an extended output domain. For more details on this read [the section on inputs for multi-output GPs](@ref Inputs-for-Multiple-Outputs).
+
+For a function ``f(x) \\rightarrow y`` denote the inputs as ``x, x'``, such that we compute the covariance between output components ``y_{p}`` and ``y_{p'}``. The total number of outputs is ``m``.

 ```@docs
 MOKernel

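As a sketch of the extended-input construction described above, multi-output inputs can be built with `MOInput` and paired with a multi-output kernel such as `IndependentMOKernel`; the sizes here are arbitrary:

```julia
using KernelFunctions

x = rand(5)                # 5 scalar inputs
xMO = MOInput(x, 3)        # extended input: 5 inputs × m = 3 outputs

# Model the outputs as independent copies of a scalar kernel.
k = IndependentMOKernel(SqExponentialKernel())

K = kernelmatrix(k, xMO)   # 15×15 covariance over all (input, output) pairs
```
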
examples/gaussian-process-priors/script.jl

Lines changed: 1 addition & 0 deletions
@@ -120,6 +120,7 @@ kernels = [
     LinearKernel(),
     compose(PeriodicKernel(), ScaleTransform(0.2)),
     NeuralNetworkKernel(),
+    GibbsKernel(; lengthscale=x -> sum(exp ∘ sin, x)),
 ]
 plot(
     [visualize(k) for k in kernels]...;
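Since this example adds a `GibbsKernel`, whose lengthscale varies with the input through the supplied function, here is a small standalone sketch of evaluating one; the lengthscale function is arbitrary:

```julia
using KernelFunctions

# Gibbs kernel: squared-exponential with an input-dependent lengthscale.
ℓ(x) = 1 + x^2
k = GibbsKernel(; lengthscale=ℓ)

k(0.3, -0.7)                              # kernels are callable on input pairs
kernelmatrix(k, range(-1, 1; length=10))  # and work with kernelmatrix as usual
```
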
Lines changed: 28 additions & 19 deletions
@@ -1,5 +1,8 @@
 # # Support Vector Machine
 #
+# In this notebook we show how you can use KernelFunctions.jl to generate
+# kernel matrices for classification with a support vector machine, as
+# implemented by LIBSVM.

 using Distributions
 using KernelFunctions
@@ -8,39 +11,45 @@ using LinearAlgebra
 using Plots
 using Random

-## Set plotting theme
-theme(:wong)
-
 ## Set seed
 Random.seed!(1234);

-# Number of samples:
-N = 100;
+# ## Generate half-moon dataset
+
+# Number of samples per class:
+nin = nout = 50;

-# Select randomly between two classes:
-y_train = rand([-1, 1], N);
+# We generate data based on SciKit-Learn's sklearn.datasets.make_moons function:

-# Random attributes for both classes:
-X = Matrix{Float64}(undef, 2, N)
-rand!(MvNormal(randn(2), I), view(X, :, y_train .== 1))
-rand!(MvNormal(randn(2), I), view(X, :, y_train .== -1));
-x_train = ColVecs(X);
+class1x = cos.(range(0, π; length=nout))
+class1y = sin.(range(0, π; length=nout))
+class2x = 1 .- cos.(range(0, π; length=nin))
+class2y = 1 .- sin.(range(0, π; length=nin)) .- 0.5
+X = hcat(vcat(class1x, class2x), vcat(class1y, class2y))
+X .+= 0.1randn(size(X))
+x_train = RowVecs(X)
+y_train = vcat(fill(-1, nout), fill(1, nin));

-# Create a 2D grid:
+# Create a 100×100 2D grid for evaluation:
 test_range = range(floor(Int, minimum(X)), ceil(Int, maximum(X)); length=100)
 x_test = ColVecs(mapreduce(collect, hcat, Iterators.product(test_range, test_range)));

+# ## SVM model
+#
 # Create kernel function:
-k = SqExponentialKernel() ∘ ScaleTransform(2.0)
+k = SqExponentialKernel() ∘ ScaleTransform(1.5)

 # [LIBSVM](https://github.com/JuliaML/LIBSVM.jl) can make use of a pre-computed kernel matrix.
 # KernelFunctions.jl can be used to produce that.
-# Precomputed matrix for training (corresponds to linear kernel)
+#
+# Precomputed matrix for training
 model = svmtrain(kernelmatrix(k, x_train), y_train; kernel=LIBSVM.Kernel.Precomputed)

 # Precomputed matrix for prediction
-y_pr, _ = svmpredict(model, kernelmatrix(k, x_train, x_test));
+y_pred, _ = svmpredict(model, kernelmatrix(k, x_train, x_test));

-# Compute prediction on a grid:
-contourf(test_range, test_range, y_pr)
-scatter!(X[1, :], X[2, :]; color=y_train, lab="data", widen=false)
+# Visualize prediction on a grid:
+plot(; lim=extrema(test_range), aspect_ratio=1)
+contourf!(test_range, test_range, y_pred; levels=1, color=cgrad(:redsblues), alpha=0.7)
+scatter!(X[y_train .== -1, 1], X[y_train .== -1, 2]; color=:red, label="class 1")
+scatter!(X[y_train .== +1, 1], X[y_train .== +1, 2]; color=:blue, label="class 2")

src/KernelFunctions.jl

Lines changed: 7 additions & 2 deletions
@@ -17,6 +17,7 @@ export PiecewisePolynomialKernel
 export PeriodicKernel, NeuralNetworkKernel
 export KernelSum, KernelProduct, KernelTensorProduct
 export TransformedKernel, ScaledKernel, NormalizedKernel
+export GibbsKernel

 export Transform,
     SelectTransform,
@@ -53,11 +54,14 @@ using Functors
 using LinearAlgebra
 using Requires
 using SpecialFunctions: loggamma, besselk, polygamma
-using IrrationalConstants: logtwo, twoπ
+using IrrationalConstants: logtwo, twoπ, invsqrt2
 using LogExpFunctions: softplus
 using StatsBase
 using TensorCore
-using ZygoteRules: ZygoteRules
+using ZygoteRules: ZygoteRules, AContext, literal_getproperty, literal_getfield
+
+# Hack to work around Zygote type inference problems.
+const Distances_pairwise = Distances.pairwise

 abstract type Kernel end
 abstract type SimpleKernel <: Kernel end
@@ -94,6 +98,7 @@ include("basekernels/rational.jl")
 include("basekernels/sm.jl")
 include("basekernels/wiener.jl")

+include("kernels/gibbskernel.jl")
 include("kernels/scaledkernel.jl")
 include("kernels/normalizedkernel.jl")
 include("matrix/kernelmatrix.jl")
