torch_geometric.nn.conv.GINConv

class GINConv(nn: Callable, eps: float = 0.0, train_eps: bool = False, **kwargs)

Bases: MessagePassing

The graph isomorphism operator from the “How Powerful are Graph Neural Networks?” paper.

\[\mathbf{x}^{\prime}_i = h_{\mathbf{\Theta}} \left( (1 + \epsilon) \cdot \mathbf{x}_i + \sum_{j \in \mathcal{N}(i)} \mathbf{x}_j \right)\]

or

\[\mathbf{X}^{\prime} = h_{\mathbf{\Theta}} \left( \left( \mathbf{A} + (1 + \epsilon) \cdot \mathbf{I} \right) \cdot \mathbf{X} \right),\]

here \(h_{\mathbf{\Theta}}\) denotes a neural network, i.e., an MLP.
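
Both forms compute the same pre-MLP aggregation; the following sketch (purely illustrative, not part of the class) checks this on a toy three-node path graph with \(\epsilon = 0\):

    import torch

    eps = 0.0
    # Undirected path 0 - 1 - 2, stored as directed edges j -> i.
    edge_index = torch.tensor([[0, 1, 1, 2],
                               [1, 0, 2, 1]])
    x = torch.tensor([[1.0], [2.0], [3.0]])  # node features, shape [3, 1]

    # Matrix form: (A + (1 + eps) * I) @ X, with A[i, j] = 1 iff j is a neighbor of i.
    A = torch.zeros(3, 3)
    A[edge_index[1], edge_index[0]] = 1.0
    out_dense = (A + (1 + eps) * torch.eye(3)) @ x

    # Node-wise form: (1 + eps) * x_i + sum over neighbors j of x_j.
    agg = torch.zeros_like(x).index_add_(0, edge_index[1], x[edge_index[0]])
    out_nodewise = (1 + eps) * x + agg

    assert torch.allclose(out_dense, out_nodewise)  # both equal [[3.], [6.], [5.]]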

Parameters:
  • nn (torch.nn.Module) – A neural network \(h_{\mathbf{\Theta}}\) that maps node features x of shape [-1, in_channels] to shape [-1, out_channels], e.g., defined by torch.nn.Sequential (see the construction sketch after this list).

  • eps (float, optional) – (Initial) \(\epsilon\)-value. (default: 0.)

  • train_eps (bool, optional) – If set to True, \(\epsilon\) will be a trainable parameter. (default: False)

  • **kwargs (optional) – Additional arguments of torch_geometric.nn.conv.MessagePassing.
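
A construction sketch: the nn argument is typically a small MLP built with torch.nn.Sequential whose first linear layer consumes in_channels features and whose last layer emits out_channels features. The channel sizes 16 and 32 below are arbitrary choices for illustration, not required by the layer:

    import torch
    from torch_geometric.nn import GINConv

    # Hypothetical channel sizes (16 -> 32), chosen only for illustration.
    mlp = torch.nn.Sequential(
        torch.nn.Linear(16, 32),   # consumes node features of shape [-1, 16]
        torch.nn.ReLU(),
        torch.nn.Linear(32, 32),   # produces node features of shape [-1, 32]
    )
    conv = GINConv(mlp, eps=0.0, train_eps=True)  # epsilon becomes a learnable parameter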

Shapes:
  • input: node features \((|\mathcal{V}|, F_{in})\) or \(((|\mathcal{V}_s|, F_{s}), (|\mathcal{V}_t|, F_{t}))\) if bipartite, edge indices \((2, |\mathcal{E}|)\)

  • output: node features \((|\mathcal{V}|, F_{out})\) or \((|\mathcal{V}_t|, F_{out})\) if bipartite

forward(x: Union[Tensor, Tuple[Tensor, Optional[Tensor]]], edge_index: Union[Tensor, SparseTensor], size: Optional[Tuple[int, int]] = None) → Tensor

Runs the forward pass of the module.
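
A minimal usage sketch, assuming a layer built as above (node counts, edge lists, and feature sizes are arbitrary):

    import torch
    from torch_geometric.nn import GINConv

    mlp = torch.nn.Sequential(torch.nn.Linear(16, 32), torch.nn.ReLU(),
                              torch.nn.Linear(32, 32))
    conv = GINConv(mlp)

    # Homogeneous graph: a single node set.
    x = torch.randn(4, 16)                     # 4 nodes with 16 features each
    edge_index = torch.tensor([[0, 1, 2, 3],   # source nodes j
                               [1, 0, 3, 2]])  # target nodes i
    out = conv(x, edge_index)                  # shape [4, 32]

    # Bipartite graph: separate source and target node sets (same feature size here).
    x_src, x_dst = torch.randn(6, 16), torch.randn(4, 16)
    bip_edge_index = torch.tensor([[0, 2, 5, 3],
                                   [0, 1, 2, 3]])
    out_dst = conv((x_src, x_dst), bip_edge_index, size=(6, 4))  # shape [4, 32]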

reset_parameters()

Resets all learnable parameters of the module.