Implementing graph co-attention in PyTorch Geometric #4821
davidbuterez
started this conversation in
Ideas
Replies: 1 comment 1 reply
-
In general, "co-attention" is applied for every node in one graph to another node in the other graph, right? In that case, I think it makes more sense to implement a "dense" version of it without the |
Beta Was this translation helpful? Give feedback.
1 reply
Sign up for free
to join this conversation on GitHub.
Already have an account?
Sign in to comment
Uh oh!
There was an error while loading. Please reload this page.
-
Hi @rusty1s,
I noticed that currently there is no graph co-attention implementation available in PyTorch Geometric. I am interested in exploring this model and attempted to implement it. The goal is to model interactions between two graphs with an attention mechanism, leading to two "outer" messages as well as two "inner" (usual) messages.The tricky part to implement in PyG seems to be the outer message co-attention, which requires every node i of graph 1 to interact with every node j of graph 2 (as opposed to just every other node j of graph 1, which is what the existing PyG propagate mechanism seems to support).
I am not used to implementing graph layers in PyG and I am not sure if PyG supports the required functionality in a natural way. I started from the GAT implementation and I was thinking of something along these lines, using repeats for the interactions between the two graphs. This is a first attempt so probably does not compute the correct thing but I was wondering if it would be the correct approach in PyG and if you have any advice!
Thanks in advance!
Beta Was this translation helpful? Give feedback.
All reactions