Missing functionality compared to DGL #41

Open
34 of 59 tasks
CarloLucibello opened this issue Sep 19, 2021 · 3 comments
Labels
good first issue Good for newcomers

Comments

@CarloLucibello
Member

CarloLucibello commented Sep 19, 2021

Checklist of functionality we are missing compared to the Deep Graph Library (DGL).
PRs are welcome!

Conv Layers

  • GraphConv (called GCNConv here)
  • EdgeWeightNorm
  • RelGraphConv
  • TAGConv
  • GATConv
  • EdgeConv
  • SAGEConv
  • SGConv
  • APPNPConv
  • GINConv
  • GatedGraphConv
  • GMMConv (Added GMMConv #147)
  • ChebConv
  • AGNNConv
  • NNConv
  • AtomicConv
  • CFConv
  • DotGatConv
  • TWIRLSConv
  • TWIRLSUnfoldingAndAttention
  • GCN2Conv

Dense Conv Layers

  • DenseGraphConv
  • DenseSAGEConv
  • DenseChebConv

Global Pooling Layers

  • SumPooling (GlobalPooling(+) here)
  • AvgPooling (GlobalPooling(mean) here)
  • MaxPooling (GlobalPooling(max) here)
  • SortPooling
  • WeightAndSum
  • GlobalAttentionPooling
  • Set2Set
  • SetTransformerEncoder
  • SetTransformerDecoder

Batching and Reading Out Ops

https://docs.dgl.ai/en/0.6.x/api/python/dgl.html#batching-and-reading-out-ops

  • batch. Use Flux.batch or SparseArrays.blockdiag
  • unbatch
  • readout_nodes (called reduce_nodes here)
  • readout_edges (called reduce_edges here)
  • sum_nodes (use reduce_nodes(+, g, x) here)
  • sum_edges (use reduce_edges(+, g, x) here)
  • mean_nodes
  • mean_edges
  • max_nodes
  • max_edges
  • softmax_nodes
  • softmax_edges
  • broadcast_nodes
  • broadcast_edges
  • topk_nodes
  • topk_edges
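For the ops that already have equivalents here, a minimal sketch of the mapping, assuming the current GraphNeuralNetworks.jl API (`rand_graph`, `Flux.batch` on `GNNGraph`s, and `reduce_nodes` as listed above; untested, exact signatures may differ across versions):

```julia
using GraphNeuralNetworks, Flux
using Statistics: mean

# Two small random graphs with 3-dimensional node features.
g1 = rand_graph(4, 8, ndata=rand(Float32, 3, 4))
g2 = rand_graph(5, 10, ndata=rand(Float32, 3, 5))

# DGL's `batch` corresponds to Flux.batch on GNNGraphs:
g = Flux.batch([g1, g2])

# DGL's `sum_nodes` / `mean_nodes` correspond to reduce_nodes,
# returning one aggregate per graph in the batch:
xsum  = reduce_nodes(+, g, g.ndata.x)
xmean = reduce_nodes(mean, g, g.ndata.x)
```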

Adjacency Related Utilities

  • khop_adj
  • laplacian_lambda_max

nn.functional

https://docs.dgl.ai/api/python/nn.functional.html

  • edge_softmax (softmax_edge_neighbors here)

optim

https://docs.dgl.ai/api/python/dgl.optim.html

  • Sparse Adam
  • Sparse AdaGrad

nn Utility Modules

  • Sequential (GNNChain here)
  • WeightBasis
  • KNNGraph
  • SegmentedKNNGraph

nn NodeEmbedding Module

  • NodeEmbedding

Sampling and Stochastic training

.....

Distributed Training

....

@CarloLucibello CarloLucibello changed the title missing layers (compared to DGL) Missing functionality compared to DGL Sep 29, 2021
@oysteinsolheim
Contributor

Would it be possible to also have a list of wished-for algorithms, whether or not they are included in DGL? :-) For example, I'd love to see edge-feature-based algorithms in general, and maybe more specifically "Exploiting Edge Features in Graph Neural Networks" from https://arxiv.org/abs/1809.02709

@CarloLucibello
Member Author

Of course! You can file a separate issue for each feature request and I'll try to get to them as soon as I can spare some time if no one beats me to it. For instance, "Exploiting Edge Features in Graph Neural Networks" seems to be cited enough that it is worth having, so you are welcome to open a new issue.

@oysteinsolheim
Contributor

Perfect! :-)

This was referenced Dec 6, 2022
@CarloLucibello CarloLucibello added the good first issue Good for newcomers label Jan 10, 2023
This was referenced Mar 9, 2023