Temporal Graph-Convolutional Layers

Convolutions for time-varying graphs (temporal graphs) such as the TemporalSnapshotsGNNGraph.

GraphNeuralNetworks.DCGRUCellType
DCGRUCell(in => out, k; [bias, init])

Diffusion Convolutional Recurrent Neural Network (DCGRU) cell from the paper Diffusion Convolutional Recurrent Neural Network: Data-driven Traffic Forecasting.

Applies a DConv layer to model spatial dependencies, combined with a Gated Recurrent Unit (GRU) cell to model temporal dependencies.

Arguments

  • in: Number of input node features.
  • out: Number of output node features.
  • k: Diffusion step for the DConv.
  • bias: Add learnable bias. Default true.
  • init: Convolution weights' initializer. Default glorot_uniform.

Forward

cell(g::GNNGraph, x, [h])
  • g: The input graph.
  • x: The node features. It should be a matrix of size in x num_nodes.
  • h: The current state of the GRU cell. It is a matrix of size out x num_nodes. If not provided, it is assumed to be a matrix of zeros.

Performs one recurrence step and returns a tuple (h, h), where h is the updated hidden state of the GRU cell.
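The GRU-style update can be sketched in plain Julia. This is an illustrative sketch, not the library's implementation: the dense weights Wr, Wu, Wc stand in for the diffusion convolutions that DCGRUCell actually applies to the concatenated input and state.

```julia
# Illustrative GRU-style recurrence step. In DCGRUCell the three matrix
# products below are diffusion convolutions over the graph; plain dense
# weights Wr, Wu, Wc stand in for them here.
σ(z) = 1 / (1 + exp(-z))  # logistic sigmoid

function gru_step(Wr, Wu, Wc, x, h)
    xh = vcat(x, h)                    # concatenate input and state
    r = σ.(Wr * xh)                    # reset gate
    u = σ.(Wu * xh)                    # update gate
    c = tanh.(Wc * vcat(x, r .* h))    # candidate state
    hnew = u .* h .+ (1 .- u) .* c     # convex combination of old and candidate
    return hnew, hnew                  # (output, new state), as in the cell above
end
```

With in = 2 and out = 3, each weight has size out x (in + out), and h defaults to a matrix of zeros, matching the forward description above.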

Examples

julia> using GraphNeuralNetworks, Flux

julia> num_nodes, num_edges = 5, 10;

julia> d_in, d_out = 2, 3;

julia> timesteps = 5;

julia> g = rand_graph(num_nodes, num_edges);

julia> x = [rand(Float32, d_in, num_nodes) for t in 1:timesteps];

julia> cell = DCGRUCell(d_in => d_out, 2);

julia> state = Flux.initialstates(cell);

julia> y = state;

julia> for xt in x
           y, state = cell(g, xt, state)
       end

julia> size(y) # (d_out, num_nodes)
(3, 5)
source
GraphNeuralNetworks.EvolveGCNOCellType
EvolveGCNOCell(in => out; bias = true, init = glorot_uniform)

Evolving Graph Convolutional Network cell of type "-O" from the paper EvolveGCN: Evolving Graph Convolutional Networks for Dynamic Graphs.

Uses a GCNConv layer to model spatial dependencies, and an LSTMCell to model temporal dependencies. Can work with time-varying graphs and node features.
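The weight-evolution idea can be sketched generically. This is a hedged sketch under stated assumptions, not the library's code: `evolve` stands for the LSTM update that treats the convolution weight as its hidden state, and `conv` for the graph convolution that then uses the evolved weight; both names are hypothetical.

```julia
# Sketch of the EvolveGCN-O pattern: the convolution weight W is itself
# the hidden state of an LSTM and is evolved at every timestep, then used
# for the spatial convolution. `evolve` and `conv` are hypothetical
# stand-ins for the LSTM update and the GCN convolution.
function evolve_step(evolve, conv, g, x, (W, lstm_state))
    W, lstm_state = evolve(W, lstm_state)  # update the weights through time
    y = conv(g, x, W)                      # spatial convolution with evolved W
    return y, (W, lstm_state)              # (output, updated state)
end
```

Because the LSTM acts on the weights rather than the node features, the cell can be applied to a different graph at every timestep, which is why it works with time-varying graphs.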

Arguments

  • in => out: A pair where in is the number of input node features and out is the number of output node features.
  • bias: Add learnable bias for the convolution and the lstm cell. Default true.
  • init: Weights' initializer for the convolution. Default glorot_uniform.

Forward

cell(g::GNNGraph, x, [state]) -> x, state
  • g: The input graph.
  • x: The node features. It should be a matrix of size in x num_nodes.
  • state: The current state of the cell. A state is a tuple (weight, lstm) where weight is the convolution's weight and lstm is the lstm's state. If not provided, it is generated by calling Flux.initialstates(cell).

Returns the updated node features x and the updated state.

julia> using GraphNeuralNetworks, Flux

julia> num_nodes, num_edges = 5, 10;

julia> d_in, d_out = 2, 3;

julia> timesteps = 5;

julia> g = [rand_graph(num_nodes, num_edges) for t in 1:timesteps];

julia> x = [rand(Float32, d_in, num_nodes) for t in 1:timesteps];

julia> cell1 = EvolveGCNOCell(d_in => d_out)
EvolveGCNOCell(2 => 3)  # 321 parameters

julia> cell2 = EvolveGCNOCell(d_out => d_out)
EvolveGCNOCell(3 => 3)  # 696 parameters

julia> state1 = Flux.initialstates(cell1);

julia> state2 = Flux.initialstates(cell2);

julia> outputs = [];

julia> for t in 1:timesteps
           zt, state1 = cell1(g[t], x[t], state1)
           yt, state2 = cell2(g[t], zt, state2)
           outputs = vcat(outputs, [yt])
       end

julia> size(outputs[end]) # (d_out, num_nodes)
(3, 5)
source
GraphNeuralNetworks.GConvGRUCellType
GConvGRUCell(in => out, k; [bias, init])

Graph Convolutional Gated Recurrent Unit (GConvGRU) recurrent cell from the paper Structured Sequence Modeling with Graph Convolutional Recurrent Networks.

Uses ChebConv to model spatial dependencies, followed by a Gated Recurrent Unit (GRU) cell to model temporal dependencies.

Arguments

  • in => out: A pair where in is the number of input node features and out is the number of output node features.
  • k: Chebyshev polynomial order.
  • bias: Add learnable bias. Default true.
  • init: Weights' initializer. Default glorot_uniform.

Forward

cell(g::GNNGraph, x, [h])
  • g: The input graph.
  • x: The node features. It should be a matrix of size in x num_nodes.
  • h: The current hidden state of the GRU cell. If given, it is a matrix of size out x num_nodes. If not provided, it is assumed to be a matrix of zeros.

Performs one recurrence step and returns a tuple (h, h), where h is the updated hidden state of the GRU cell.

Examples

julia> using GraphNeuralNetworks, Flux

julia> num_nodes, num_edges = 5, 10;

julia> d_in, d_out = 2, 3;

julia> timesteps = 5;

julia> g = rand_graph(num_nodes, num_edges);

julia> x = [rand(Float32, d_in, num_nodes) for t in 1:timesteps];

julia> cell = GConvGRUCell(d_in => d_out, 2);

julia> state = Flux.initialstates(cell);

julia> y = state;

julia> for xt in x
           y, state = cell(g, xt, state)
       end

julia> size(y) # (d_out, num_nodes)
(3, 5)
source
GraphNeuralNetworks.GConvLSTMCellType
GConvLSTMCell(in => out, k; [bias, init])

Graph Convolutional LSTM recurrent cell from the paper Structured Sequence Modeling with Graph Convolutional Recurrent Networks.

Uses ChebConv to model spatial dependencies, followed by a Long Short-Term Memory (LSTM) cell to model temporal dependencies.

Arguments

  • in => out: A pair where in is the number of input node features and out is the number of output node features.
  • k: Chebyshev polynomial order.
  • bias: Add learnable bias. Default true.
  • init: Weights' initializer. Default glorot_uniform.

Forward

cell(g::GNNGraph, x, [state])
  • g: The input graph.
  • x: The node features. It should be a matrix of size in x num_nodes.
  • state: The current state of the LSTM cell. If given, it is a tuple (h, c) where both h and c are arrays of size out x num_nodes. If not provided, it is assumed to be a tuple of matrices of zeros.

Performs one recurrence step and returns a tuple (output, state), where output is the updated hidden state h of the LSTM cell and state is the updated tuple (h, c).
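The LSTM-style update behind this cell can be sketched in plain Julia. This is a simplified illustration, not the library's implementation: dense weights Wi, Wf, Wo, Wc stand in for the Chebyshev graph convolutions, and peephole-style terms are omitted.

```julia
# Illustrative LSTM recurrence step. In GConvLSTMCell the four matrix
# products below are Chebyshev graph convolutions; plain dense weights
# Wi, Wf, Wo, Wc stand in for them here (peephole terms omitted).
σ(z) = 1 / (1 + exp(-z))  # logistic sigmoid

function lstm_step(Wi, Wf, Wo, Wc, x, (h, c))
    xh = vcat(x, h)                        # concatenate input and hidden state
    i = σ.(Wi * xh)                        # input gate
    f = σ.(Wf * xh)                        # forget gate
    o = σ.(Wo * xh)                        # output gate
    cnew = f .* c .+ i .* tanh.(Wc * xh)   # updated cell state
    hnew = o .* tanh.(cnew)                # updated hidden state
    return hnew, (hnew, cnew)              # (output, state), as documented above
end
```

This mirrors the documented convention: the first return value is the output h, and the second is the full state tuple (h, c).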

Examples

julia> using GraphNeuralNetworks, Flux

julia> num_nodes, num_edges = 5, 10;

julia> d_in, d_out = 2, 3;

julia> timesteps = 5;

julia> g = rand_graph(num_nodes, num_edges);

julia> x = [rand(Float32, d_in, num_nodes) for t in 1:timesteps];

julia> cell = GConvLSTMCell(d_in => d_out, 2);

julia> state = Flux.initialstates(cell);

julia> y = state[1];

julia> for xt in x
           y, state = cell(g, xt, state)
       end

julia> size(y) # (d_out, num_nodes)
(3, 5)
source
GraphNeuralNetworks.GNNRecurrenceType
GNNRecurrence(cell)

Construct a recurrent layer that applies the graph recurrent cell forward multiple times to process an entire temporal sequence of node features at once.

The cell must satisfy the following interface for the forward pass: yt, state = cell(g, xt, state), where xt is the node feature matrix for one timestep, yt is the corresponding output, and state is the cell state carried to the next step.
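For a static graph, the recurrence amounts to the following minimal sketch, assuming only the cell interface above (`apply_recurrence` is a hypothetical name, not part of the package):

```julia
# Hypothetical sketch of the recurrence loop for a static graph: apply
# the cell once per timestep of an in x timesteps x num_nodes array and
# stack the outputs along the time dimension.
function apply_recurrence(cell, g, x::AbstractArray{<:Real,3}, state)
    outputs = map(1:size(x, 2)) do t
        yt, state = cell(g, x[:, t, :], state)  # state threads across timesteps
        yt
    end
    return stack(outputs; dims = 2)  # out x timesteps x num_nodes
end
```

The key point is that the state is threaded through the loop, so each timestep sees the state produced by the previous one.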

Forward

layer(g, x, [state])

Applies the recurrent cell to each timestep of the input sequence.

Arguments

  • g: The input GNNGraph or TemporalSnapshotsGNNGraph.
    • If GNNGraph, the same graph is used for all timesteps.
    • If TemporalSnapshotsGNNGraph, a different graph is used for each timestep. Not all cells support this.
  • x: The time-varying node features.
    • If g is GNNGraph, it is an array of size in x timesteps x num_nodes.
    • If g is TemporalSnapshotsGNNGraph, it is a vector of length timesteps, with element t of size in x num_nodes_t.
  • state: The initial state for the cell. If not provided, it is generated by calling Flux.initialstates(cell).

Return

Returns the updated node features:

  • If g is GNNGraph, returns an array of size out_features x timesteps x num_nodes.
  • If g is TemporalSnapshotsGNNGraph, returns a vector of length timesteps, with element t of size out_features x num_nodes_t.

Examples

The following example considers a static graph and time-varying node features.

julia> num_nodes, num_edges = 5, 10;

julia> d_in, d_out = 2, 3;

julia> timesteps = 5;

julia> g = rand_graph(num_nodes, num_edges)
GNNGraph:
  num_nodes: 5
  num_edges: 10

julia> x = rand(Float32, d_in, timesteps, num_nodes);

julia> cell = GConvLSTMCell(d_in => d_out, 2)
GConvLSTMCell(2 => 3, 2)  # 168 parameters

julia> layer = GNNRecurrence(cell)
GNNRecurrence(
  GConvLSTMCell(2 => 3, 2),             # 168 parameters
)                   # Total: 24 arrays, 168 parameters, 2.023 KiB.

julia> y = layer(g, x);

julia> size(y) # (d_out, timesteps, num_nodes)
(3, 5, 5)

Now consider a time-varying graph and time-varying node features.

julia> d_in, d_out = 2, 3;

julia> timesteps = 5;

julia> num_nodes = [10, 10, 10, 10, 10];

julia> num_edges = [10, 12, 14, 16, 18];

julia> snapshots = [rand_graph(n, m) for (n, m) in zip(num_nodes, num_edges)];

julia> tg = TemporalSnapshotsGNNGraph(snapshots);

julia> x = [rand(Float32, d_in, n) for n in num_nodes];

julia> cell = EvolveGCNOCell(d_in => d_out)
EvolveGCNOCell(2 => 3)  # 321 parameters

julia> layer = GNNRecurrence(cell)
GNNRecurrence(
  EvolveGCNOCell(2 => 3),               # 321 parameters
)                   # Total: 5 arrays, 321 parameters, 1.535 KiB.

julia> y = layer(tg, x);

julia> length(y)    # timesteps
5

julia> size(y[end]) # (d_out, num_nodes[end])
(3, 10)
source
GraphNeuralNetworks.TGCNCellType
TGCNCell(in => out; kws...)

Recurrent graph convolutional cell from the paper T-GCN: A Temporal Graph Convolutional Network for Traffic Prediction.

Uses two stacked GCNConv layers to model spatial dependencies, and a GRU mechanism to model temporal dependencies.

in and out are the number of input and output node features, respectively. The keyword arguments are passed to the GCNConv constructor.

Forward

cell(g::GNNGraph, x, [state])
  • g: The input graph.
  • x: The node features. It should be a matrix of size in x num_nodes.
  • state: The current state of the cell. If not provided, it is generated by calling Flux.initialstates(cell). The state is a matrix of size out x num_nodes.

Returns the updated node features and the updated state.

Examples

julia> using GraphNeuralNetworks, Flux

julia> num_nodes, num_edges = 5, 10;

julia> d_in, d_out = 2, 3;

julia> timesteps = 5;

julia> g = rand_graph(num_nodes, num_edges);

julia> x = [rand(Float32, d_in, num_nodes) for t in 1:timesteps];

julia> cell = TGCNCell(d_in => d_out);

julia> state = Flux.initialstates(cell);

julia> y = state;

julia> for xt in x
           y, state = cell(g, xt, state)
       end

julia> size(y) # (d_out, num_nodes)
(3, 5)
source
GraphNeuralNetworks.DCGRUMethod
DCGRU(args...; kws...)

Construct a recurrent layer corresponding to the DCGRUCell cell. It can be used to process an entire temporal sequence of node features at once.

The arguments are passed to the DCGRUCell constructor. See GNNRecurrence for more details.

Examples

julia> num_nodes, num_edges = 5, 10;

julia> d_in, d_out = 2, 3;

julia> timesteps = 5;

julia> g = rand_graph(num_nodes, num_edges);

julia> x = rand(Float32, d_in, timesteps, num_nodes);

julia> layer = DCGRU(d_in => d_out, 2)
GNNRecurrence(
  DCGRUCell(2 => 3, 2),                 # 189 parameters
)                   # Total: 6 arrays, 189 parameters, 1.184 KiB.

julia> y = layer(g, x);

julia> size(y) # (d_out, timesteps, num_nodes)
(3, 5, 5)
source
GraphNeuralNetworks.EvolveGCNOMethod
EvolveGCNO(args...; kws...)

Construct a recurrent layer corresponding to the EvolveGCNOCell cell. It can be used to process an entire temporal sequence of graphs and node features at once.

The arguments are passed to the EvolveGCNOCell constructor. See GNNRecurrence for more details.

Examples

julia> d_in, d_out = 2, 3;

julia> timesteps = 5;

julia> num_nodes = [10, 10, 10, 10, 10];

julia> num_edges = [10, 12, 14, 16, 18];

julia> snapshots = [rand_graph(n, m) for (n, m) in zip(num_nodes, num_edges)];

julia> tg = TemporalSnapshotsGNNGraph(snapshots);

julia> x = [rand(Float32, d_in, n) for n in num_nodes];

julia> layer = EvolveGCNO(d_in => d_out)
GNNRecurrence(
  EvolveGCNOCell(2 => 3),               # 321 parameters
)                   # Total: 5 arrays, 321 parameters, 1.535 KiB.

julia> y = layer(tg, x);

julia> length(y)    # timesteps
5

julia> size(y[end]) # (d_out, num_nodes[end])
(3, 10)
source
GraphNeuralNetworks.GConvGRUMethod
GConvGRU(args...; kws...)

Construct a recurrent layer corresponding to the GConvGRUCell cell. It can be used to process an entire temporal sequence of node features at once.

The arguments are passed to the GConvGRUCell constructor. See GNNRecurrence for more details.

Examples

julia> num_nodes, num_edges = 5, 10;

julia> d_in, d_out = 2, 3;

julia> timesteps = 5;

julia> g = rand_graph(num_nodes, num_edges);

julia> x = rand(Float32, d_in, timesteps, num_nodes);

julia> layer = GConvGRU(d_in => d_out, 2)
GNNRecurrence(
  GConvGRUCell(2 => 3, 2),              # 108 parameters
)                   # Total: 12 arrays, 108 parameters, 1.148 KiB.

julia> y = layer(g, x);

julia> size(y) # (d_out, timesteps, num_nodes)
(3, 5, 5)
source
GraphNeuralNetworks.GConvLSTMMethod
GConvLSTM(args...; kws...)

Construct a recurrent layer corresponding to the GConvLSTMCell cell. It can be used to process an entire temporal sequence of node features at once.

The arguments are passed to the GConvLSTMCell constructor. See GNNRecurrence for more details.

Examples

julia> num_nodes, num_edges = 5, 10;

julia> d_in, d_out = 2, 3;

julia> timesteps = 5;

julia> g = rand_graph(num_nodes, num_edges);

julia> x = rand(Float32, d_in, timesteps, num_nodes);

julia> layer = GConvLSTM(d_in => d_out, 2)
GNNRecurrence(
  GConvLSTMCell(2 => 3, 2),             # 168 parameters
)                   # Total: 24 arrays, 168 parameters, 2.023 KiB.

julia> y = layer(g, x);

julia> size(y) # (d_out, timesteps, num_nodes)
(3, 5, 5)
source
GraphNeuralNetworks.TGCNMethod
TGCN(args...; kws...)

Construct a recurrent layer corresponding to the TGCNCell cell.

The arguments are passed to the TGCNCell constructor. See GNNRecurrence for more details.

Examples

julia> num_nodes, num_edges = 5, 10;

julia> d_in, d_out = 2, 3;

julia> timesteps = 5;

julia> g = rand_graph(num_nodes, num_edges);

julia> x = rand(Float32, d_in, timesteps, num_nodes);

julia> layer = TGCN(d_in => d_out);

julia> y = layer(g, x);

julia> size(y) # (d_out, timesteps, num_nodes)
(3, 5, 5)
source