> Thanks to my project supervisor, I now realize that this is all correct and equivalent to the paper definition.

Thanks for checking this out. The equivalence is not immediately obvious; maybe we should add a comment about that in the code.
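The equivalence can also be checked numerically, independent of the library: applying one weight matrix to the concatenation of two feature vectors gives the same result as splitting that matrix into column blocks and summing two separate projections. A minimal NumPy sketch (the names `W1` and `W2` are just labels for the blocks, not identifiers from the package):

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out = 4, 3
x_i = rng.normal(size=d_in)  # features of node i
x_j = rng.normal(size=d_in)  # features of node j

# One weight matrix applied to the concatenation [x_i; x_j] ...
W = rng.normal(size=(d_out, 2 * d_in))
concat_form = W @ np.concatenate([x_i, x_j])

# ... equals the sum of two separate projections, where W1 and W2
# are the left and right column blocks of W.
W1, W2 = W[:, :d_in], W[:, d_in:]
sum_form = W1 @ x_i + W2 @ x_j

assert np.allclose(concat_form, sum_form)
```

Since the LeakyReLU and the attention vector are applied on top of either expression, both ways of writing the attention score yield identical values.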
> So it appears that the issue lies with the documentation. Instead of a vector concatenation, it should state addition, or stick with the original order of operations.

I agree, it should state addition. I'll leave you the honor of filing a PR if you have time; otherwise I'll see to it soon.
The documentation on `GATv2Conv` says:

This is not exactly the same as Equation (7) from "How Attentive are Graph Attention Networks?":

I also checked the relevant piece of code (`conv.jl`):

Thanks to my project supervisor, I now realize that this is all correct and equivalent to the paper definition. So it appears that the issue lies with the documentation. Instead of a vector concatenation, it should state addition, or stick with the original order of operations.
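Spelled out, the equivalence in question looks like the following (a sketch in block notation; $\mathbf{W}_1$ and $\mathbf{W}_2$ are my own labels for the column blocks of the paper's $\mathbf{W}$):

```latex
% Eq. (7) of "How Attentive are Graph Attention Networks?" scores an edge as
e(h_i, h_j) = \mathbf{a}^{\top}\,\mathrm{LeakyReLU}\!\bigl(\mathbf{W}\,[\,h_i \,\Vert\, h_j\,]\bigr)
% Partitioning W column-wise as W = [W_1 | W_2] gives
\mathbf{W}\,[\,h_i \,\Vert\, h_j\,] = \mathbf{W}_1 h_i + \mathbf{W}_2 h_j ,
% so the score can equivalently be written with an addition:
e(h_i, h_j) = \mathbf{a}^{\top}\,\mathrm{LeakyReLU}\!\bigl(\mathbf{W}_1 h_i + \mathbf{W}_2 h_j\bigr).
```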
As a small side note, the markup on `GATConv` is a bit broken, and it says “Dafault” instead of “Default”.