Created concrete types to call syntactic sugar on all kernels #57
Conversation
I was not aware of any way to get all the concrete types from an abstract type, so I needed to add a small recursive function.
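Such a recursive helper might look like the following sketch (the name `concrete_subtypes` and the exact implementation are illustrative, not necessarily what the PR uses):

```julia
# Recursively collect all concrete subtypes of an abstract type.
# `subtypes` lives in InteractiveUtils in the standard library.
using InteractiveUtils: subtypes

function concrete_subtypes(T::Type)
    isconcretetype(T) && return Type[T]
    return reduce(vcat, [concrete_subtypes(S) for S in subtypes(T)]; init=Type[])
end

# e.g. concrete_subtypes(Integer) collects Bool, Int8, ..., BigInt,
# walking through the abstract Signed/Unsigned layers.
```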
Is the idea to handle only the kernels in KernelFunctions, or also custom user-defined kernels? To me it seems this will only cover kernels that are already defined when the package is loaded - however, since KernelFunctions has to be loaded before a user can define custom kernels, it will only cover the kernels in KernelFunctions. Moreover, these input types might not make sense at all for kernels that don't define a metric (and hence shouldn't call into the metric-based implementation). So I think we might not even want to define these dispatches for all kernels in KernelFunctions.
But kernels which don't have a proper metric, etc., would just throw an error, in the same way that calling
Yes, I know there's probably no nice solution for Julia <= 1.2; it just seems to me that even on Julia >= 1.3 these definitions might not be useful, for the two reasons outlined above. Regarding the kernels with metrics, maybe we should just add a trait so that we can dispatch on whether a kernel has a metric or not. Or even make that an abstract subtype (thinking about abstract method definitions for these kernels on Julia > 1.2) and rather make BaseKernels a trait.
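The trait idea could be sketched like this (a minimal, self-contained illustration; `MetricTrait`, `HasMetric`, `NoMetric`, and `metric_trait` are hypothetical names, and the stand-in kernel types below are not the real KernelFunctions definitions):

```julia
# Stand-ins for the real KernelFunctions type hierarchy.
abstract type Kernel end
struct SqExponentialKernel <: Kernel end
struct TensorProductKernel <: Kernel end

# Holy-trait style dispatch on "does this kernel define a metric?"
abstract type MetricTrait end
struct HasMetric <: MetricTrait end
struct NoMetric <: MetricTrait end

metric_trait(::Kernel) = NoMetric()                 # conservative default
metric_trait(::SqExponentialKernel) = HasMetric()   # opt in per kernel

# The syntactic sugar is only defined for kernels carrying a metric:
kernel_sugar(k::Kernel, X) = kernel_sugar(metric_trait(k), k, X)
kernel_sugar(::HasMetric, k, X) = "metric-based code path"
kernel_sugar(::NoMetric, k, X) = error("no metric defined for $(typeof(k))")
```

This keeps the fallback error explicit instead of relying on a missing-method error, and lets user-defined kernels opt in by adding one `metric_trait` method.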
That's why I thought from a mathematical perspective it's a bit weird if

```julia
function kernelmatrix(kernel::TensorProductKernel, (X1, X2)::Tuple, (Y1, Y2)::Tuple)
    return kernelmatrix(kernel.kernel1, X1, Y1) .* kernelmatrix(kernel.kernel2, X2, Y2)
end
```
Ok, I think I found what is probably the best option for now. We should just remove altogether the
Not for a tensor product kernel, e.g., but at least for all kernels with real-valued vectors as inputs (and hence all currently implemented kernels in KernelFunctions I guess) 😉 |
Fair enough! But maybe we can do what @sharanry did for one of the kernels: add it to an exception list.
Sure, but it's not even part of KernelFunctions so it doesn't matter right now 🤷‍♀️
src/kernels/fbm.jl (Outdated)

```julia
        return new{T}([h])
    end
end

Base.show(io::IO, κ::FBMKernel) = print(io, "Fractional Brownian Motion Kernel (h = $(first(κ.h)))")
```

```diff
-_fbm(modX, modY, modXY, h) = (modX^h + modY^h - modXY^h)/2
+_fbm(modX, modY, modXY, h) = (modX^h + modY^h - abs(modXY)^h)/2
```
Is `abs` actually needed? `modXY` is supposed to be computed with `SqEuclidean` or `abs2` in all functions that call `_fbm`.
Well, looks like `SqEuclidean` has some issues then: https://travis-ci.com/github/JuliaGaussianProcesses/KernelFunctions.jl/jobs/308331838
I guess it might be due to https://github.com/JuliaStats/Distances.jl#precision-for-euclidean-and-sqeuclidean. Maybe a proper fix in Distances would be to set it to zero in case of negative values.
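A minimal, self-contained illustration of the round-off issue and the zero-clamping fix being proposed (this mimics the expanded distance formula; it is not the actual Distances.jl internals):

```julia
using LinearAlgebra

x = [1.0 + 1e-9]
y = [1.0]

# The expanded pairwise formula ‖x‖² + ‖y‖² - 2⟨x, y⟩, used for speed,
# can suffer cancellation and dip slightly below zero for nearly equal inputs:
d2 = norm(x)^2 + norm(y)^2 - 2 * dot(x, y)

# Proposed fix: clamp negative round-off to zero before taking powers/roots,
# so downstream code like _fbm never sees a negative squared distance.
d2_safe = max(d2, zero(d2))
```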
We could also just use the roundoff argument when the computations are sensitive (like for FBM).
IMO it's more appropriate to fix this issue in Distances; we shouldn't fix round-off values here. I opened a PR: JuliaStats/Distances.jl#161
Should solve #55, and similarly for all non-BaseKernels (until we drop support for 1.0).