Tensor product kernel #56
Conversation
I guess,
Yes, I think the latter would be more flexible for an even more general class of data.
Yes, I agree. Actually, in my use case I need both a kernel on a space of distributions (the first input space) and a kernel on a regular vector space (the second input space), so in contrast to the example above it is unclear how to combine everything into one vector in order to avoid dealing with an explicit tensor product kernel.
By tensor product, do you mean that for N samples with D types of observations you would get an N^D x N^D kernel matrix?
No, the name "tensor product kernel" has nothing to do with the kernel matrix; it is just the name for a kernel that applies D different kernels to the D different types of observations, instead of applying one kernel on the product space of the D input spaces. To be consistent with the current implementation, IMO the kernel matrix would just be the matrix of kernel evaluations on all pairs of observations. Hence if we have
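As a rough illustration of that description, here is a minimal sketch in Julia. The name `TensorProductKernel` and the plain-function "kernels" are purely hypothetical and not the existing KernelFunctions API:

```julia
# Hypothetical sketch (not part of KernelFunctions): a tensor product kernel
# stores one base kernel per input space and multiplies their evaluations,
# k((x1, …, xD), (y1, …, yD)) = k1(x1, y1) * k2(x2, y2) * … * kD(xD, yD).
struct TensorProductKernel{K<:Tuple}
    kernels::K
end

(k::TensorProductKernel)(x::Tuple, y::Tuple) =
    prod(ki(xi, yi) for (ki, xi, yi) in zip(k.kernels, x, y))

# Example with two plain-function "kernels": a squared exponential on the
# real-valued component and a simple match kernel on the integer component.
k = TensorProductKernel((
    (x, y) -> exp(-sum(abs2, x .- y) / 2),
    (x, y) -> float(x == y),
))

# The kernel matrix is just the N×N matrix of evaluations on all pairs of
# observations (N = 4 here), not an N^D × N^D matrix.
xs = [(randn(3), rand(1:5, 2)) for _ in 1:4]
K = [k(x, y) for x in xs, y in xs]
```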
src/KernelFunctions.jl
Outdated
```diff
-for k in ["exponential","matern","polynomial","constant","rationalquad","exponentiated","cosine"]
-    include(joinpath("kernels", k*".jl"))
+for f in readdir(joinpath(@__DIR__, "basekernels"))
+    endswith(f, ".jl") && include(joinpath("basekernels", f))
 end
```
Why did we not do this before!!!!
I think we can merge already, or would you like to add something?
I guess it's usable but without a
Lately, I've been working quite a bit with tensor product kernels. I thought maybe they could be supported by KernelFunctions. However, there are still some open questions about the most reasonable approach for `kernelmatrix` and so on.

Each "observation" is a tuple whose first and second arguments belong to different input spaces, and hence is e.g. of type `Tuple{Vector{Float64},Vector{Int}}`. Should `kernelmatrix` then take tuples of multiple observations for each input space, i.e. of type `Tuple{Matrix{Float64},Matrix{Int}}`, or rather a vector of multiple observations, i.e. of type `Vector{Tuple{Vector{Float64},Vector{Int}}}`?
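To make the two candidate layouts concrete, here is a small sketch (the variable names are only illustrative) of what the inputs would look like for N = 3 observations:

```julia
# Option 1: one tuple with a "stacked" container per input space
# (here each column of a matrix is one observation).
xs_tuple = (randn(2, 3), rand(1:10, 4, 3))            # Tuple{Matrix{Float64},Matrix{Int}}

# Option 2: a vector of per-observation tuples.
xs_vec = [(randn(2), rand(1:10, 4)) for _ in 1:3]     # Vector{Tuple{Vector{Float64},Vector{Int}}}

# With option 2 the kernel matrix is simply the pairwise evaluation of a
# tuple-valued kernel k: [k(x, y) for x in xs_vec, y in xs_vec].
# Option 1 keeps each input space in a contiguous array, which makes it
# easier to reuse specialised pairwise code per input space.
```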