torch_geometric.nn.conv.AGNNConv
- class AGNNConv(requires_grad: bool = True, add_self_loops: bool = True, **kwargs)[source]
Bases: MessagePassing
The graph attentional propagation layer from the “Attention-based Graph Neural Network for Semi-Supervised Learning” paper.
\[\mathbf{X}^{\prime} = \mathbf{P} \mathbf{X},\]
where the propagation matrix \(\mathbf{P}\) is computed as
\[P_{i,j} = \frac{\exp( \beta \cdot \cos(\mathbf{x}_i, \mathbf{x}_j))} {\sum_{k \in \mathcal{N}(i)\cup \{ i \}} \exp( \beta \cdot \cos(\mathbf{x}_i, \mathbf{x}_k))}\]
with trainable parameter \(\beta\).
- Parameters:
requires_grad (bool, optional) – If set to False, \(\beta\) will not be trainable. (default: True)
add_self_loops (bool, optional) – If set to False, will not add self-loops to the input graph. (default: True)
**kwargs (optional) – Additional arguments of torch_geometric.nn.conv.MessagePassing.
- Shapes:
input: node features \((|\mathcal{V}|, F)\), edge indices \((2, |\mathcal{E}|)\)
output: node features \((|\mathcal{V}|, F)\)
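The propagation rule above can be sketched outside the library as well. The following is a minimal NumPy sketch, not the library implementation: `agnn_propagate` is a hypothetical helper that assumes `edge_index` already contains self-loops and computes \(\mathbf{X}^{\prime} = \mathbf{P}\mathbf{X}\) with the \(\beta\)-scaled cosine-similarity softmax defined above.

```python
import numpy as np

def agnn_propagate(x, edge_index, beta=1.0):
    """Hypothetical NumPy sketch of the AGNN propagation rule.

    x:          (N, F) node feature matrix
    edge_index: (2, E) edges (source row, target row), self-loops included
    beta:       the (here fixed) scalar that is trainable in AGNNConv
    """
    src, dst = edge_index
    # Unit-normalize rows so a dot product gives the cosine similarity.
    xn = x / np.linalg.norm(x, axis=1, keepdims=True)
    # cos(x_i, x_j) for every edge (j -> i).
    cos = np.sum(xn[dst] * xn[src], axis=1)
    # Softmax of beta * cos over the incoming edges of each target node i.
    alpha = np.exp(beta * cos)
    denom = np.zeros(x.shape[0])
    np.add.at(denom, dst, alpha)
    p = alpha / denom[dst]
    # X' = P X: weighted sum of source features per target node.
    out = np.zeros_like(x)
    np.add.at(out, dst, p[:, None] * x[src])
    return out
```

Because each row of \(\mathbf{P}\) sums to one, the output is a convex combination of neighbor features; with identical node features the layer acts as the identity.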