Update README.md #446

Merged 1 commit on Apr 13, 2022
README.md: 16 changes (8 additions & 8 deletions)
@@ -9,14 +9,14 @@
[![DOI](https://zenodo.org/badge/188430419.svg)](https://zenodo.org/badge/latestdoi/188430419)



## Kernel functions for machine learning

KernelFunctions.jl provides a flexible framework for defining kernel functions, and an extensive collection of implementations.

The aim is to make the API as model-agnostic as possible while still being user-friendly, and to interoperate well with generic packages for handling parameters like [ParameterHandling.jl](https://github.com/invenia/ParameterHandling.jl/) and FluxML's [Functors.jl](https://github.com/FluxML/Functors.jl/).

Where appropriate, kernels are AD-compatible.
**KernelFunctions.jl** is a general purpose [kernel](https://en.wikipedia.org/wiki/Positive-definite_kernel) package.
It provides a flexible framework for creating kernel functions and manipulating them, and an extensive collection of implementations.
The main goals of this package are:
- **Flexibility**: operations between kernels, such as sums, products, and input transformations, should compose fluidly and reliably behind a user-friendly API.
- **Plug-and-play**: the package should be model-agnostic, so that kernels can be slotted in before or after other modelling steps, and it should interoperate well with generic parameter-handling packages like [ParameterHandling.jl](https://github.com/invenia/ParameterHandling.jl/) and FluxML's [Functors.jl](https://github.com/FluxML/Functors.jl/).
- **Automatic Differentiation compatibility**: all kernel functions which _ought_ to be differentiable using AD packages like [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl) or [Zygote.jl](https://github.com/FluxML/Zygote.jl) _should_ be.
Comment on lines +14 to +19
Member Author:
Copied this over from https://github.com/JuliaGaussianProcesses/KernelFunctions.jl/blob/master/docs/src/index.md and edited it slightly. If you're happy with this I'll also "back-port" it to docs/src/index.md.
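To make the goals above concrete for readers skimming this PR, here is a minimal usage sketch. The particular kernels, weights, and input sizes are arbitrary illustrative choices, not anything prescribed by this change:

```julia
using KernelFunctions

# Kernels compose through ordinary arithmetic and input transforms;
# this particular combination is an arbitrary illustration.
k = 2.0 * SqExponentialKernel() + (Matern32Kernel() ∘ ScaleTransform(0.5))

x = collect(range(-3.0, 3.0; length=10))  # a vector of 1-D inputs
K = kernelmatrix(k, x)                    # 10×10 positive-definite Gram matrix
k(x[1], x[2])                             # pointwise evaluation

# For multi-dimensional data, wrap the matrix so the observation
# dimension is explicit (here: one observation per column).
X = randn(5, 20)
K2 = kernelmatrix(k, ColVecs(X))          # 20×20
```

`RowVecs(X)` plays the same role when observations are stored as rows.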


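The AD-compatibility goal can be sketched in the same spirit, assuming ForwardDiff.jl and Zygote.jl are available; the `loss` function below is a hypothetical stand-in introduced purely for illustration:

```julia
using KernelFunctions
using ForwardDiff, Zygote

x = collect(range(-3.0, 3.0; length=10))

# An arbitrary scalar objective: the sum of the Gram matrix of a
# squared-exponential kernel, as a function of its lengthscale ℓ.
loss(ℓ) = sum(kernelmatrix(with_lengthscale(SqExponentialKernel(), ℓ), x))

ForwardDiff.derivative(loss, 1.5)  # forward-mode derivative w.r.t. ℓ
Zygote.gradient(loss, 1.5)         # reverse-mode; returns a 1-tuple
```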
## Examples

@@ -50,9 +50,9 @@ plot(

## Related Work

Directly inspired by the [MLKernels](https://github.com/trthatcher/MLKernels.jl) package.
This package replaces the now-defunct [MLKernels.jl](https://github.com/trthatcher/MLKernels.jl). It incorporates lots of excellent existing work from packages such as [GaussianProcesses.jl](https://github.com/STOR-i/GaussianProcesses.jl), and is used in downstream packages such as [AbstractGPs.jl](https://github.com/JuliaGaussianProcesses/AbstractGPs.jl), [ApproximateGPs.jl](https://github.com/JuliaGaussianProcesses/ApproximateGPs.jl), [Stheno.jl](https://github.com/willtebbutt/Stheno.jl), and [AugmentedGaussianProcesses.jl](https://github.com/theogf/AugmentedGaussianProcesses.jl).

See the JuliaGaussianProcesses [Github organisation](https://github.com/JuliaGaussianProcesses) and [website](https://juliagaussianprocesses.github.io/) for more related packages.
See the JuliaGaussianProcesses [GitHub organisation](https://github.com/JuliaGaussianProcesses) and [website](https://juliagaussianprocesses.github.io/) for more information.

## Issues/Contributing
