Commit 30586ea

Merge branch 'master' into periodickernel
2 parents f1dad6e + b267b54 commit 30586ea

Note: large commits have some content hidden by default, so only part of the 68 changed files is shown below.

68 files changed (+1200 / -649 lines changed)

.travis.yml

Lines changed: 10 additions & 8 deletions
@@ -1,29 +1,31 @@
 ## Documentation: http://docs.travis-ci.com/user/languages/julia/
 language: julia
+branches:
+  only:
+    - master
 os:
   - linux
   - osx
 julia:
   - 1.0
-  - 1.2
   - 1.3
+  - 1.4
   - nightly
-# because of Zygote needs to allow failing on nightly
-matrix:
-  allow_failures:
-    - julia: nightly
 notifications:
   email: false
 after_success:
-  # push coverage results to Coveralls
-  - julia -e 'using Pkg; Pkg.add("Coverage"); using Coverage; Coveralls.submit(process_folder())'
+  - if [[ $TRAVIS_JULIA_VERSION = 1.4 ]] && [[ $TRAVIS_OS_NAME = linux ]]; then
+      julia -e 'using Pkg; Pkg.add("Coverage"); using Coverage; Coveralls.submit(process_folder())';
+    fi
 jobs:
   include:
     - stage: "Documentation"
-      julia: 1.0
+      julia: 1.4
       os: linux
       script:
        - export DOCUMENTER_DEBUG=true
        - julia --project=docs/ -e 'using Pkg; Pkg.develop(PackageSpec(path=pwd())); Pkg.instantiate()'
        - julia --project=docs/ docs/make.jl
      after_succes: skip
+  allow_failures:
+    - julia: nightly

Project.toml

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
 name = "KernelFunctions"
 uuid = "ec8451be-7e33-11e9-00cf-bbf324bd1392"
-version = "0.3.1"
+version = "0.3.2"

 [deps]
 Compat = "34da2185-b29b-5c13-b0c7-acf172513d20"
@@ -18,7 +18,7 @@ Compat = "2.2, 3"
 Distances = "0.8"
 Requires = "1.0.1"
 SpecialFunctions = "0.8, 0.9, 0.10"
-StatsBase = "0.32"
+StatsBase = "0.32, 0.33"
 StatsFuns = "0.8, 0.9"
 ZygoteRules = "0.2"
 julia = "1.0"

README.md

Lines changed: 7 additions & 4 deletions
@@ -1,7 +1,10 @@
-[![Build Status](https://travis-ci.org/theogf/KernelFunctions.jl.svg?branch=master)](https://travis-ci.org/theogf/KernelFunctions.jl)
-[![Coverage Status](https://coveralls.io/repos/github/theogf/KernelFunctions.jl/badge.svg?branch=master)](https://coveralls.io/github/theogf/KernelFunctions.jl?branch=master)
-[![Documentation](https://img.shields.io/badge/docs-dev-blue.svg)](https://theogf.github.io/KernelFunctions.jl/dev/)
 # KernelFunctions.jl
+
+[![Build Status](https://travis-ci.com/JuliaGaussianProcesses/KernelFunctions.jl.svg?branch=master)](https://travis-ci.com/JuliaGaussianProcesses/KernelFunctions.jl)
+[![Coverage Status](https://coveralls.io/repos/github/JuliaGaussianProcesses/KernelFunctions.jl/badge.svg?branch=master)](https://coveralls.io/github/JuliaGaussianProcesses/KernelFunctions.jl?branch=master)
+[![Documentation (stable)](https://img.shields.io/badge/docs-stable-blue.svg)](https://juliagaussianprocesses.github.io/KernelFunctions.jl/stable)
+[![Documentation (latest)](https://img.shields.io/badge/docs-dev-blue.svg)](https://juliagaussianprocesses.github.io/KernelFunctions.jl/dev)
+
 ## Kernel functions for machine learning

 KernelFunctions.jl provide a flexible and complete framework for kernel functions, pretransforming the input data.
@@ -43,4 +46,4 @@ Directly inspired by the [MLKernels](https://github.com/trthatcher/MLKernels.jl)

 ## Issues/Contributing

-If you notice a problem or would like to contribute by adding more kernel functions or features please [submit an issue](https://github.com/theogf/KernelFunctions.jl/issues).
+If you notice a problem or would like to contribute by adding more kernel functions or features please [submit an issue](https://github.com/JuliaGaussianProcesses/KernelFunctions.jl/issues).

docs/make.jl

Lines changed: 2 additions & 1 deletion
@@ -12,6 +12,7 @@ makedocs(
             "Transform"=>"transform.md",
             "Metrics"=>"metrics.md",
             "Theory"=>"theory.md",
+            "Custom Kernels"=>"create_kernel.md",
             "API"=>"api.md"]
 )

@@ -20,6 +21,6 @@ makedocs(
 # for more information.
 deploydocs(
     deps = Deps.pip("mkdocs", "python-markdown-math"),
-    repo = "github.com/theogf/KernelFunctions.jl.git",
+    repo = "github.com/JuliaGaussianProcesses/KernelFunctions.jl.git",
     target = "build"
 )

docs/src/create_kernel.md

Lines changed: 20 additions & 0 deletions
@@ -0,0 +1,20 @@
+## Creating your own kernel
+
+KernelFunctions.jl contains the most popular kernels already but you might want to make your own!
+
+Here is for example how one can define the Squared Exponential Kernel again :
+
+```julia
+struct MyKernel <: Kernel end
+
+KernelFunctions.kappa(::MyKernel, d2::Real) = exp(-d2)
+KernelFunctions.metric(::MyKernel) = SqEuclidean()
+```
+
+For a "Base" kernel, where the kernel function is simply a function applied on some metric between two vectors of real, you only need to:
+- Define your struct inheriting from `Kernel`.
+- Define a `kappa` function.
+- Define the metric used `SqEuclidean`, `DotProduct` etc. Note that the term "metric" is here overabused.
+- Optional : Define any parameter of your kernel as `trainable` by Flux.jl if you want to perform optimization on the parameters. We recommend wrapping all parameters in arrays to allow them to be mutable.
+
+Once these functions are defined, you can use all the wrapping functions of KernelFuntions.jl
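For illustration (not part of the diff), a minimal sketch of how such a custom kernel could be used once defined, assuming the `MyKernel` definition from the new page above together with the exported `kernelmatrix` function and its `obsdim` keyword:

```julia
using KernelFunctions
using Distances: SqEuclidean

# Custom squared-exponential kernel, as defined in the new documentation page above.
struct MyKernel <: Kernel end

KernelFunctions.kappa(::MyKernel, d2::Real) = exp(-d2)
KernelFunctions.metric(::MyKernel) = SqEuclidean()

# Build a kernel matrix from 20 observations stored as columns of X.
X = rand(5, 20)
K = kernelmatrix(MyKernel(), X; obsdim = 2)   # 20×20 positive semi-definite matrix
```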

src/KernelFunctions.jl

Lines changed: 6 additions & 5 deletions
@@ -1,19 +1,21 @@
 """
-KernelFunctions. [Github](https://github.com/theogf/KernelFunctions.jl) [Documentation](https://theogf.github.io/KernelFunctions.jl/dev/)
+KernelFunctions. [Github](https://github.com/JuliaGaussianProcesses/KernelFunctions.jl)
+[Documentation](https://juliagaussianprocesses.github.io/KernelFunctions.jl/stable/)
 """
 module KernelFunctions

 export kernelmatrix, kernelmatrix!, kerneldiagmatrix, kerneldiagmatrix!, kappa
 export transform
-export params, duplicate, set! # Helpers
+export duplicate, set! # Helpers

 export Kernel
-export ConstantKernel, WhiteKernel, ZeroKernel
+export ConstantKernel, WhiteKernel, EyeKernel, ZeroKernel
 export SqExponentialKernel, ExponentialKernel, GammaExponentialKernel
 export ExponentiatedKernel
 export MaternKernel, Matern32Kernel, Matern52Kernel
 export LinearKernel, PolynomialKernel
 export RationalQuadraticKernel, GammaRationalQuadraticKernel
+export MahalanobisKernel, GaborKernel, PiecewisePolynomialKernel
 export PeriodicKernel
 export KernelSum, KernelProduct
 export TransformedKernel, ScaledKernel
@@ -46,7 +48,7 @@ include("distances/delta.jl")
 include("distances/sinus.jl")
 include("transform/transform.jl")

-for k in ["constant","exponential","exponentiated","matern","periodic","polynomial","rationalquad"]
+for k in ["exponential","matern","polynomial","constant","rationalquad","exponentiated","cosine","maha","fbm","gabor","periodic","piecewisepolynomial"]
     include(joinpath("kernels",k*".jl"))
 end
 include("kernels/transformedkernel.jl")
@@ -55,7 +57,6 @@ include("matrix/kernelmatrix.jl")
 include("kernels/kernelsum.jl")
 include("kernels/kernelproduct.jl")
 include("approximations/nystrom.jl")
-
 include("generic.jl")

 include("zygote_adjoints.jl")

src/generic.jl

Lines changed: 6 additions & 2 deletions
@@ -15,9 +15,13 @@ printshifted(io::IO,κ::Kernel,shift::Int) = print(io,"$κ")
 Base.show(io::IO,κ::Kernel) = print(io,nameof(typeof(κ)))

 ### Syntactic sugar for creating matrices and using kernel functions
-for k in subtypes(BaseKernel)
+function concretetypes(k, ktypes::Vector)
+    isempty(subtypes(k)) ? push!(ktypes, k) : concretetypes.(subtypes(k), Ref(ktypes))
+    return ktypes
+end
+
+for k in concretetypes(Kernel, [])
     @eval begin
-        @inline (κ::$k)(d::Real) = kappa(κ,d) #TODO Add test
        @inline (κ::$k)(x::AbstractVector{<:Real}, y::AbstractVector{<:Real}) = kappa(κ, x, y)
        @inline (κ::$k)(X::AbstractMatrix{T}, Y::AbstractMatrix{T}; obsdim::Integer=defaultobs) where {T} = kernelmatrix(κ, X, Y, obsdim=obsdim)
        @inline (κ::$k)(X::AbstractMatrix{T}; obsdim::Integer=defaultobs) where {T} = kernelmatrix(κ, X, obsdim=obsdim)
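For illustration (not part of the diff): the new `concretetypes` helper recursively walks a type tree and collects its concrete leaves, so the callable sugar is generated for every concrete kernel type rather than only the direct subtypes of `BaseKernel`. A standalone sketch of the same recursion, using hypothetical example types:

```julia
using InteractiveUtils: subtypes   # `subtypes` lives in InteractiveUtils, not Base

# Collect all concrete leaf types below an abstract type, depth first.
function concretetypes(k, ktypes::Vector)
    isempty(subtypes(k)) ? push!(ktypes, k) : concretetypes.(subtypes(k), Ref(ktypes))
    return ktypes
end

abstract type Shape end
abstract type Polygon <: Shape end
struct Circle <: Shape end
struct Square <: Polygon end

concretetypes(Shape, [])   # Any[Circle, Square]
```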

src/kernels/constant.jl

Lines changed: 22 additions & 6 deletions
@@ -1,5 +1,5 @@
 """
-ZeroKernel()
+    ZeroKernel()

 Create a kernel that always returning zero
 ```
@@ -13,26 +13,40 @@ kappa(κ::ZeroKernel, d::T) where {T<:Real} = zero(T)

 metric(::ZeroKernel) = Delta()

+Base.show(io::IO, ::ZeroKernel) = print(io, "Zero Kernel")
+
+
 """
-`WhiteKernel()`
+    WhiteKernel()

 ```
     κ(x,y) = δ(x,y)
 ```
-Kernel function working as an equivalent to add white noise.
+Kernel function working as an equivalent to add white noise. Can also be called via `EyeKernel()`
 """
 struct WhiteKernel <: BaseKernel end

-kappa(κ::WhiteKernel,δₓₓ::Real) = δₓₓ
+"""
+    EyeKernel()
+
+See [WhiteKernel](@ref)
+"""
+const EyeKernel = WhiteKernel
+
+kappa(κ::WhiteKernel, δₓₓ::Real) = δₓₓ

 metric(::WhiteKernel) = Delta()

+Base.show(io::IO, ::WhiteKernel) = print(io, "White Kernel")
+
+
 """
-`ConstantKernel(c=1.0)`
+    ConstantKernel(; c=1.0)
+
+Kernel function always returning a constant value `c`
 ```
     κ(x,y) = c
 ```
-Kernel function always returning a constant value `c`
 """
 struct ConstantKernel{Tc<:Real} <: BaseKernel
     c::Vector{Tc}
@@ -44,3 +58,5 @@ end
 kappa(κ::ConstantKernel,x::Real) = first(κ.c)*one(x)

 metric(::ConstantKernel) = Delta()
+
+Base.show(io::IO, κ::ConstantKernel) = print(io, "Constant Kernel (c = $(first(κ.c)))")
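For illustration (not part of the diff): with the new alias, the white-noise kernel can be constructed as either `WhiteKernel()` or `EyeKernel()`. A rough sketch of its behaviour, assuming the exported `kappa` and `kernelmatrix` API and the `Delta` "metric" shown above:

```julia
using KernelFunctions

EyeKernel === WhiteKernel      # true, EyeKernel is only an alias
k = EyeKernel()

# Delta returns 1 for identical inputs and 0 otherwise, and kappa passes that
# value through, so the kernel matrix of distinct points has ones on the diagonal
# and zeros elsewhere.
X = rand(3, 4)
kernelmatrix(k, X; obsdim = 2)   # 4×4 matrix, identity for distinct columns
```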

src/kernels/cosine.jl

Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
+"""
+    CosineKernel()
+
+The cosine kernel is a stationary kernel for a sinusoidal given by
+```
+    κ(x,y) = cos( π * (x-y) )
+```
+"""
+struct CosineKernel <: BaseKernel end
+
+kappa(κ::CosineKernel, d::Real) = cospi(d)
+metric(::CosineKernel) = Euclidean()
+
+Base.show(io::IO, ::CosineKernel) = print(io, "Cosine Kernel")
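For illustration (not part of the diff), a quick numeric check of the new kernel: `cospi(d)` evaluates cos(π·d) on the Euclidean distance, so points one unit apart are maximally anti-correlated. (`CosineKernel` does not appear in the export list shown above, so it is qualified here.)

```julia
using KernelFunctions

k = KernelFunctions.CosineKernel()
kappa(k, 0.0)   #  1.0  (identical inputs)
kappa(k, 0.5)   #  0.0  (quarter period)
kappa(k, 1.0)   # -1.0  (half period)
```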

src/kernels/exponential.jl

Lines changed: 17 additions & 10 deletions
@@ -1,31 +1,33 @@
 """
-`SqExponentialKernel()`
+    SqExponentialKernel()

-The squared exponential kernel is an isotropic Mercer kernel given by the formula:
+The squared exponential kernel is a Mercer kernel given by the formula:
 ```
     κ(x,y) = exp(-‖x-y‖²)
 ```
+Can also be called via `SEKernel`, `GaussianKernel` or `SEKernel`.
 See also [`ExponentialKernel`](@ref) for a
 related form of the kernel or [`GammaExponentialKernel`](@ref) for a generalization.
 """
 struct SqExponentialKernel <: BaseKernel end

 kappa(κ::SqExponentialKernel, d²::Real) = exp(-d²)
 iskroncompatible(::SqExponentialKernel) = true
-
 metric(::SqExponentialKernel) = SqEuclidean()

 Base.show(io::IO,::SqExponentialKernel) = print(io,"Squared Exponential Kernel")

 ## Aliases ##
 const RBFKernel = SqExponentialKernel
 const GaussianKernel = SqExponentialKernel
+const SEKernel = SqExponentialKernel

 """
-`ExponentialKernel([ρ=1.0])`
-The exponential kernel is an isotropic Mercer kernel given by the formula:
+    ExponentialKernel()
+
+The exponential kernel is a Mercer kernel given by the formula:
 ```
-    κ(x,y) = exp(-ρ‖x-y‖)
+    κ(x,y) = exp(-‖x-y‖)
 ```
 """
 struct ExponentialKernel <: BaseKernel end
@@ -34,21 +36,24 @@ kappa(κ::ExponentialKernel, d::Real) = exp(-d)
 iskroncompatible(::ExponentialKernel) = true
 metric(::ExponentialKernel) = Euclidean()

-Base.show(io::IO,::ExponentialKernel) = print(io,"Exponential Kernel")
+Base.show(io::IO, ::ExponentialKernel) = print(io, "Exponential Kernel")

 ## Alias ##
 const LaplacianKernel = ExponentialKernel

 """
-`GammaExponentialKernel([ρ=1.0, [γ=2.0]])`
+    GammaExponentialKernel(; γ = 2.0)
+
 The γ-exponential kernel is an isotropic Mercer kernel given by the formula:
 ```
-    κ(x,y) = exp(-ρ^(2γ)‖x-y‖^(2γ))
+    κ(x,y) = exp(-‖x-y‖^(2γ))
 ```
+Where `γ > 0`, (the keyword `γ` can be replaced by `gamma`)
+For `γ = 1`, see `SqExponentialKernel` and `γ = 0.5`, see `ExponentialKernel`
 """
 struct GammaExponentialKernel{Tγ<:Real} <: BaseKernel
     γ::Vector{Tγ}
-    function GammaExponentialKernel(;γ::T=2.0) where {T<:Real}
+    function GammaExponentialKernel(; gamma::T=2.0, γ::T=gamma) where {T<:Real}
         @check_args(GammaExponentialKernel, γ, γ >= zero(T), "γ > 0")
         return new{T}([γ])
     end
@@ -57,3 +62,5 @@ end
 kappa(κ::GammaExponentialKernel, d²::Real) = exp(-d²^first(κ.γ))
 iskroncompatible(::GammaExponentialKernel) = true
 metric(::GammaExponentialKernel) = SqEuclidean()
+
+Base.show(io::IO, κ::GammaExponentialKernel) = print(io, "Gamma Exponential Kernel (γ = $(first(κ.γ)))")
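For illustration (not part of the diff), a quick check of the new `gamma` keyword introduced above, assuming the exported `kappa` and the constructor shown in the hunk: with γ = 1 the γ-exponential kernel reduces to the squared exponential.

```julia
using KernelFunctions

k = GammaExponentialKernel(gamma = 1.0)   # equivalent to GammaExponentialKernel(γ = 1.0)

# Both kernels use the squared Euclidean distance, so kappa receives d² directly.
kappa(k, 0.25) ≈ kappa(SqExponentialKernel(), 0.25)   # true, both equal exp(-0.25)
```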

src/kernels/exponentiated.jl

Lines changed: 5 additions & 4 deletions
@@ -1,14 +1,15 @@
 """
-`ExponentiatedKernel([ρ=1])`
+    ExponentiatedKernel()
+
 The exponentiated kernel is a Mercer kernel given by:
 ```
-    κ(x,y) = exp(ρ²xᵀy)
+    κ(x,y) = exp(xᵀy)
 ```
 """
 struct ExponentiatedKernel <: BaseKernel end

 kappa(κ::ExponentiatedKernel, xᵀy::Real) = exp(xᵀy)
-
 metric(::ExponentiatedKernel) = DotProduct()
-
 iskroncompatible(::ExponentiatedKernel) = true
+
+Base.show(io::IO, ::ExponentiatedKernel) = print(io, "Exponentiated Kernel")
