torch_geometric.nn.norm.GraphNorm

class GraphNorm(in_channels: int, eps: float = 1e-05)[source]

Bases: Module

Applies graph normalization over individual graphs as described in the “GraphNorm: A Principled Approach to Accelerating Graph Neural Network Training” paper.

\[\mathbf{x}^{\prime}_i = \frac{\mathbf{x}_i - \alpha \odot \textrm{E}[\mathbf{x}]}{\sqrt{\textrm{Var}[\mathbf{x} - \alpha \odot \textrm{E}[\mathbf{x}]] + \epsilon}} \odot \gamma + \beta\]

where \(\alpha\) denotes a learnable parameter that controls how much of the mean statistic to keep during normalization, and \(\gamma\) and \(\beta\) are the usual learnable affine scale and shift.
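To make the roles of \(\alpha\), \(\gamma\), and \(\beta\) concrete, the following is a minimal plain-PyTorch sketch of the normalization for a single graph. The names alpha, gamma, beta, and eps are placeholders chosen for this illustration, not the module's internal attribute names:

```python
import torch

# Node features of a single graph: [num_nodes, in_channels]
x = torch.randn(5, 8)
in_channels = x.size(-1)

# Stand-ins for the learnable parameters, set to typical initial values.
alpha = torch.ones(in_channels)   # how much of the mean to subtract
gamma = torch.ones(in_channels)   # affine scale
beta = torch.zeros(in_channels)   # affine shift
eps = 1e-5

mean = x.mean(dim=0)                  # E[x] over the graph's nodes
shifted = x - alpha * mean            # x_i - alpha ⊙ E[x]
var = shifted.pow(2).mean(dim=0)      # variance term of the shifted features
out = shifted / (var + eps).sqrt() * gamma + beta
```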

Parameters:
  • in_channels (int) – Size of each input sample.

  • eps (float, optional) – A value added to the denominator for numerical stability. (default: 1e-5)
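A minimal usage sketch placing GraphNorm between message-passing layers. The model structure, feature sizes, and layer choice (GCNConv) are illustrative assumptions, not part of this module's API:

```python
import torch
from torch_geometric.nn import GCNConv, GraphNorm

class GCN(torch.nn.Module):
    def __init__(self, in_channels, hidden_channels, out_channels):
        super().__init__()
        self.conv1 = GCNConv(in_channels, hidden_channels)
        self.norm = GraphNorm(hidden_channels)   # eps defaults to 1e-5
        self.conv2 = GCNConv(hidden_channels, out_channels)

    def forward(self, x, edge_index, batch=None):
        x = self.conv1(x, edge_index).relu()
        # Normalize node features per graph; `batch` assigns nodes to graphs.
        x = self.norm(x, batch)
        return self.conv2(x, edge_index)
```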

reset_parameters()[source]

Resets all learnable parameters of the module.

forward(x: Tensor, batch: Optional[Tensor] = None, batch_size: Optional[int] = None) → Tensor[source]

Forward pass.

Parameters:
  • x (torch.Tensor) – The input node feature tensor.

  • batch (torch.Tensor, optional) – The batch vector \(\mathbf{b} \in {\{ 0, \ldots, B-1\}}^N\), which assigns each element to a specific example. (default: None)

  • batch_size (int, optional) – The number of examples \(B\). Automatically calculated if not given. (default: None)
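A short forward-pass sketch with an explicit batch vector. The two toy graphs (3 and 2 nodes) and the feature size are made up for illustration:

```python
import torch
from torch_geometric.nn import GraphNorm

norm = GraphNorm(in_channels=16)
norm.reset_parameters()

x = torch.randn(5, 16)                 # 5 nodes in total
batch = torch.tensor([0, 0, 0, 1, 1])  # nodes 0-2 -> graph 0, nodes 3-4 -> graph 1

out = norm(x, batch)                   # batch_size (B = 2) is inferred from `batch`
assert out.size() == (5, 16)

# If `batch` is omitted, all nodes are treated as belonging to a single graph.
out_single = norm(x)
```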