Created concrete types to call syntactic sugar on all kernels #57

Merged: 10 commits merged into master from syntacticsugarallkernels on Mar 30, 2020

Conversation

@theogf (Member) commented Mar 25, 2020

Should solve #55, and similarly for all non-BaseKernel kernels (until we drop support for Julia 1.0).

@theogf (Member, Author) commented Mar 25, 2020

I was not aware of any way to get all the concrete types from an abstract type, so I needed to add a small recursive function.
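
For reference, a minimal sketch of such a recursive helper, assuming InteractiveUtils.subtypes (concrete_subtypes is a hypothetical name; the function added in this PR may differ):

using InteractiveUtils: subtypes

# Recursively collect every concrete subtype of an abstract type.
function concrete_subtypes(T::Type)
    isconcretetype(T) && return Type[T]
    return reduce(vcat, (concrete_subtypes(S) for S in subtypes(T)); init=Type[])
end

# e.g. concrete_subtypes(Kernel) lists the concrete kernel types defined at
# load time, which can then be used to generate the call-syntax methods.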

@devmotion (Member) commented:

Is the idea to handle only the kernels in KernelFunctions or also custom user-defined kernels?

To me it seems this will only cover kernels that are already defined when the package is loaded - however, since KernelFunctions has to be loaded first before a user can define custom kernels, it will only cover the kernels in KernelFunctions.

Moreover, these input types might not make sense at all for kernels that don't define a metric (and hence shouldn't call into kappa(kernel, d)) or operate on completely different input spaces (see the discussion about tensor product kernels).

So I think we might not even want to define these dispatches for all kernels in KernelFunctions.

@theogf (Member, Author) commented Mar 25, 2020

But kernels which don't have a proper metric, etc., would just throw an error, in the same way calling kappa(k, d) does; that does not seem like bad behavior to me.
The issue of newly defined kernels was discussed, but it seems the only possible solution is to drop support for Julia <= 1.2.

@devmotion (Member) commented:

Yes, I know there's probably no nice solution for Julia <= 1.2; it just seems to me that even on Julia >= 1.3 these definitions might not be useful, for the two reasons outlined above. Regarding the kernels with metrics, maybe we should just add a trait so that we can dispatch on whether a kernel has a metric or not. Or even make that an abstract subtype (thinking about abstract method definitions for these kernels on Julia > 1.2) and rather make BaseKernels a trait.
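
A minimal sketch of the trait idea, with hypothetical names (MetricTrait, HasMetric, NoMetric, metric_trait, and kernel_eval are illustrations, not KernelFunctions API), assuming KernelFunctions' kappa and metric helpers:

using KernelFunctions: Kernel, SqExponentialKernel, kappa, metric
using Distances: evaluate

abstract type MetricTrait end
struct HasMetric <: MetricTrait end
struct NoMetric <: MetricTrait end

metric_trait(::Kernel) = NoMetric()                # default: no metric available
metric_trait(::SqExponentialKernel) = HasMetric()  # kernels built on a Distances.jl metric opt in

# Functions that need a metric dispatch on the trait instead of the kernel type:
kernel_eval(k::Kernel, x, y) = kernel_eval(metric_trait(k), k, x, y)
kernel_eval(::HasMetric, k, x, y) = kappa(k, evaluate(metric(k), x, y))
kernel_eval(::NoMetric, k, x, y) = error("$(nameof(typeof(k))) does not define a metric")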

@devmotion (Member) commented:

That's why I thought that, from a mathematical perspective, it's a bit weird if kernelmatrix takes inputs of type Tuple; but from a programmer's perspective it would be neat, since then one could write:

function kernelmatrix(kernel::TensorProductKernel, (X1, X2)::Tuple, (Y1, Y2)::Tuple)
    return kernelmatrix(kernel.kernel1, X1, Y1) .* kernelmatrix(kernel.kernel2, X2, Y2)
end 

@theogf (Member, Author) commented Mar 26, 2020

OK, I think I found what is probably the best option for now: we should just remove the (k::Kernel)(d::Real) syntax altogether. I don't think it's useful for anyone, and it's too specific. The rest of the function should still be valid for any kernel, I believe.
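
For context, a rough sketch of the two flavors of call syntax under discussion, assuming KernelFunctions' kappa and metric helpers (the exact definitions in the PR may differ):

using KernelFunctions: SqExponentialKernel, kappa, metric
using Distances: evaluate

# The sugar proposed for removal: calling a kernel on a precomputed distance.
#     (k::SqExponentialKernel)(d::Real) = kappa(k, d)

# The sugar that stays: calling a kernel on a pair of inputs, routed through
# the kernel's metric, which is meaningful for kernels on real-valued vectors.
(k::SqExponentialKernel)(x, y) = kappa(k, evaluate(metric(k), x, y))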

@devmotion (Member) commented:

> The rest of the function should still be valid for any kernel I believe.

Not for a tensor product kernel, for example, but at least for all kernels with real-valued vectors as inputs (and hence, I guess, for all kernels currently implemented in KernelFunctions) 😉

@theogf (Member, Author) commented Mar 26, 2020

Fair enough! But maybe we can do what @sharanry did for one of the kernels: add it to an exception list.

@devmotion (Member) commented:

Sure, but it's not even part of KernelFunctions so it doesn't matter right now 🤷‍♀️

Review thread on the FBMKernel code:

        return new{T}([h])
    end
end

Base.show(io::IO, κ::FBMKernel) = print(io, "Fractional Brownian Motion Kernel (h = $(first(κ.h)))")

-_fbm(modX, modY, modXY, h) = (modX^h + modY^h - modXY^h)/2
+_fbm(modX, modY, modXY, h) = (modX^h + modY^h - abs(modXY)^h)/2

Review comment (Member):

Is abs actually needed? modXY is supposed to be computed with SqEuclidean or abs2 in all functions that call _fbm.

Review comment (Member):

I guess it might be due to https://github.com/JuliaStats/Distances.jl#precision-for-euclidean-and-sqeuclidean

Maybe a proper fix in Distances would be to set it to zero in case of negative values.

Review comment (Member, Author):

We could also just use the roundoff argument when the computations are sensitive (like for FBM).
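
Presumably this refers to the precision threshold accepted by Distances.jl's SqEuclidean; a small usage sketch (not code from this PR):

using Distances

X = rand(3, 10)

# SqEuclidean(thresh) guards pairwise results against round-off in the
# x'x + y'y - 2x'y formulation (results that look like round-off noise are
# recomputed directly), so no negative distances reach _fbm.
D = pairwise(SqEuclidean(1e-12), X; dims=2)
all(d -> d >= 0, D)  # true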

Review comment (Member):

IMO it's more appropriate to fix this issue in Distances; we shouldn't fix round-off values here. I opened a PR: JuliaStats/Distances.jl#161

@theogf theogf merged commit 1b0a74e into JuliaGaussianProcesses:master Mar 30, 2020
@theogf theogf deleted the syntacticsugarallkernels branch March 30, 2020 12:15