torch_geometric.nn.models.DeepGCNLayer

class DeepGCNLayer(conv: Optional[Module] = None, norm: Optional[Module] = None, act: Optional[Module] = None, block: str = 'res+', dropout: float = 0.0, ckpt_grad: bool = False)[source]

Bases: Module

The skip connection operations from the “DeepGCNs: Can GCNs Go as Deep as CNNs?” and “DeeperGCN: All You Need to Train Deeper GCNs” papers. The implemented skip connections include the pre-activation residual connection ("res+"), the residual connection ("res"), the dense connection ("dense"), and no connection ("plain"); a minimal sketch of the orderings follows the list below.

  • Res+ ("res+"):

\[\text{Normalization}\to\text{Activation}\to\text{Dropout}\to \text{GraphConv}\to\text{Res}\]
  • Res ("res") / Dense ("dense") / Plain ("plain"):

\[\text{GraphConv}\to\text{Normalization}\to\text{Activation}\to \text{Res/Dense/Plain}\to\text{Dropout}\]
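As a minimal sketch of these orderings (assuming a single node-feature tensor x and an edge_index; the actual layer forwards arbitrary *args/**kwargs to the wrapped operator and optionally checkpoints gradients):

    import torch.nn.functional as F

    def res_plus_block(x, edge_index, conv, norm, act, p):
        # "res+": normalize, activate and drop out *before* the graph
        # convolution, then add the input back (pre-activation residual).
        h = F.dropout(act(norm(x)), p=p, training=True)
        return x + conv(h, edge_index)

    def res_block(x, edge_index, conv, norm, act, p):
        # "res": convolve first, then normalize and activate, add the
        # residual, and apply dropout last. "dense" concatenates x and h
        # along the feature dimension instead of adding; "plain" applies
        # no skip connection at all.
        h = act(norm(conv(x, edge_index)))
        return F.dropout(x + h, p=p, training=True)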

Note

For an example of using DeepGCNLayer (together with GENConv), see examples/ogbn_proteins_deepgcn.py.

Parameters:
  • conv (torch.nn.Module, optional) – The graph convolution operator. (default: None)

  • norm (torch.nn.Module, optional) – The normalization layer. (default: None)

  • act (torch.nn.Module, optional) – The activation layer. (default: None)

  • block (str, optional) – The skip connection operation to use ("res+", "res", "dense" or "plain"). (default: "res+")

  • dropout (float, optional) – The dropout probability. (default: 0.)

  • ckpt_grad (bool, optional) – If set to True, will checkpoint this part of the model. Checkpointing trades compute for memory, since intermediate activations do not need to be kept in memory. Set this to True if you encounter out-of-memory errors while going deep. (default: False)
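A usage sketch for a single layer (the feature dimension of 64 and the toy graph are illustrative assumptions; the GENConv/LayerNorm/ReLU combination follows the DeeperGCN setup):

    import torch
    from torch.nn import LayerNorm, ReLU
    from torch_geometric.nn import DeepGCNLayer, GENConv

    # Toy graph: 4 nodes with 64-dimensional features (sizes are illustrative).
    x = torch.randn(4, 64)
    edge_index = torch.tensor([[0, 1, 2, 3],
                               [1, 2, 3, 0]])

    conv = GENConv(64, 64, aggr='softmax', t=1.0, learn_t=True)
    norm = LayerNorm(64, elementwise_affine=True)
    act = ReLU(inplace=True)

    layer = DeepGCNLayer(conv, norm, act, block='res+', dropout=0.1)
    out = layer(x, edge_index)  # shape [4, 64]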

forward(*args, **kwargs) → Tensor[source]

Return type:

Tensor
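To go deep, layers are typically stacked in a ModuleList and called in sequence; the first positional argument is taken as the node feature matrix, and the remaining arguments are forwarded to conv. The depth of 28 layers, the hidden size of 64, and the ckpt_grad=i % 3 checkpointing pattern below are assumptions mirroring examples/ogbn_proteins_deepgcn.py:

    import torch
    from torch.nn import LayerNorm, ReLU
    from torch_geometric.nn import DeepGCNLayer, GENConv

    layers = torch.nn.ModuleList()
    for i in range(1, 29):  # 28 layers
        conv = GENConv(64, 64, aggr='softmax', t=1.0, learn_t=True)
        norm = LayerNorm(64, elementwise_affine=True)
        act = ReLU(inplace=True)
        # Checkpoint two out of every three layers to save memory.
        layers.append(DeepGCNLayer(conv, norm, act, block='res+',
                                   dropout=0.1, ckpt_grad=bool(i % 3)))

    def deep_gcn(x, edge_index):
        for layer in layers:
            x = layer(x, edge_index)
        return x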

reset_parameters()[source]

Resets all learnable parameters of the module.