GNN

1) Find graph structure

Structured

Unstructured

2) Specify graph type and scale

Directed/Undirected Graphs (in directed graphs, all edges point from one node to another; an undirected edge can be treated as two directed edges)

Homogeneous/Heterogeneous Graphs (nodes and edges in homogeneous graphs all have the same type, while heterogeneous graphs mix several node or edge types)

Static/Dynamic Graphs (in dynamic graphs, the input features or the topology vary with time)

3) Design loss function

Edge-level (edge classification and link prediction, i.e. predicting whether an edge exists between two nodes)

Graph-level (graph classification, graph regression, and graph matching)

Node-level (node classification, node regression, node clustering)


4) Training setting

Semi-supervised setting (only a small subset of nodes is labeled; a loss sketch follows this list)

Unsupervised setting

Supervised setting
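
As a concrete illustration of the semi-supervised setting, here is a minimal sketch of a node-level classification loss that only counts labeled nodes. The function name `masked_cross_entropy` and the toy data are illustrative assumptions, not part of any specific library.

```python
import numpy as np

def masked_cross_entropy(logits, labels, labeled_mask):
    """Semi-supervised node-level loss: cross-entropy computed only over
    the labeled subset of nodes (illustrative sketch)."""
    # Softmax over the class dimension.
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = exp / exp.sum(axis=1, keepdims=True)
    # Negative log-likelihood of the true class, per node.
    nll = -np.log(probs[np.arange(len(labels)), labels] + 1e-12)
    # Only labeled nodes contribute to the loss.
    return nll[labeled_mask].mean()

# Toy usage: 4 nodes, 3 classes, only nodes 0 and 2 are labeled.
logits = np.random.randn(4, 3)
labels = np.array([0, 0, 2, 1])   # labels of unlabeled nodes are ignored
labeled_mask = np.array([True, False, True, False])
print(masked_cross_entropy(logits, labels, labeled_mask))
```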


Build model using computational modules

Sampling Module (When graphs are large, sampling modules are usually needed to conduct propagation on graphs.)

Pooling Module (extract information from nodes to obtain a high-level representation of the graph or subgraph)

Propagation Module

recurrent operator (aggregate information from neighbors)

convolution operator (aggregate information from neighbors)

skip connection (gather information from historical node representations across layers and help mitigate over-smoothing)

The major difference between recurrent operators and convolution operators is that layers in convolution operators use different weights, while layers in recurrent operators share the same weights.
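
To make the weight-sharing distinction concrete, here is a minimal numpy sketch: the convolution-style stack applies a different weight matrix per layer, while the recurrent-style stack reuses one matrix across steps. The symmetric normalization follows the common GCN form; the function names and the tanh recurrence are simplifications for illustration, not any particular model's exact update.

```python
import numpy as np

def normalize_adj(A):
    """Symmetrically normalized adjacency with self-loops, as used by
    GCN-style convolution operators (sketch)."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return np.diag(d_inv_sqrt) @ A_hat @ np.diag(d_inv_sqrt)

def convolution_stack(A, X, weights):
    """Each layer aggregates neighbor features with its *own* weight matrix."""
    A_norm = normalize_adj(A)
    H = X
    for W in weights:                      # a different W per layer
        H = np.maximum(A_norm @ H @ W, 0)  # ReLU(A_norm @ H @ W_l)
    return H

def recurrent_stack(A, X, W, steps):
    """Every step reuses the *same* weight matrix (simplified recurrent update)."""
    A_norm = normalize_adj(A)
    H = X
    for _ in range(steps):                 # shared W across all steps
        H = np.tanh(A_norm @ H @ W)
    return H
```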

Stacking more layers also propagates noisy information from an exponentially growing expanded neighborhood. It likewise causes the over-smoothing problem: nodes tend to have similar representations after repeated aggregation as models go deeper.
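
A skip (residual) connection is one common way to counter this. The sketch below, which assumes a normalized adjacency such as the `normalize_adj` output from the previous sketch and a constant layer width, adds each layer's input back to its output so node-specific signal survives deeper stacks; it is an illustrative pattern, not a specific model.

```python
import numpy as np

def residual_gcn_stack(A_norm, X, weights):
    """Convolution stack with a skip (residual) connection per layer:
    H_{l+1} = ReLU(A_norm @ H_l @ W_l) + H_l.  Keeping H_l in the sum
    preserves node-specific information as depth grows (sketch)."""
    H = X
    for W in weights:
        H = np.maximum(A_norm @ H @ W, 0) + H
    return H
```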

Node sampling: select a subset from each node's neighborhood (a neighbor-sampling sketch follows this list).

Layer sampling: retain a small set of nodes for aggregation in each layer to control the expansion factor.

Subgraph sampling: sample multiple subgraphs and restrict the neighborhood search within these subgraphs (region-based).
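
A minimal sketch of node sampling in the GraphSAGE spirit: for each target node, keep at most k randomly chosen neighbors before aggregating. The function name and the adjacency-list layout (a plain dict) are assumptions made for illustration.

```python
import numpy as np

def sample_neighbors(adj_list, nodes, k, rng=None):
    """Node sampling: keep at most k randomly chosen neighbors per target
    node, bounding the cost of one aggregation step (sketch)."""
    if rng is None:
        rng = np.random.default_rng()
    sampled = {}
    for v in nodes:
        neigh = adj_list[v]
        if len(neigh) > k:
            neigh = list(rng.choice(neigh, size=k, replace=False))
        sampled[v] = neigh
    return sampled

# Toy usage: node 0 has 5 neighbors, but only 2 are used for aggregation.
adj_list = {0: [1, 2, 3, 4, 5], 1: [0], 2: [0], 3: [0], 4: [0], 5: [0]}
print(sample_neighbors(adj_list, nodes=[0], k=2))
```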

Direct pooling: learn graph-level representations directly from nodes with different node selection strategies (a readout sketch follows).

Hierarchical pooling: coarsen the graph layer by layer, following the hierarchical structure between nodes.
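
A minimal sketch of direct pooling: a readout collapses the node-embedding matrix into one graph-level vector, and a simple top-k variant illustrates one possible node selection strategy. Both functions are illustrative, not a specific paper's API.

```python
import numpy as np

def readout(H, mode="mean"):
    """Direct pooling: collapse node embeddings H (n x d) into a single
    graph-level vector by aggregating over nodes (sketch)."""
    if mode == "mean":
        return H.mean(axis=0)
    if mode == "sum":
        return H.sum(axis=0)
    if mode == "max":
        return H.max(axis=0)
    raise ValueError(mode)

def topk_readout(H, scores, k):
    """A simple node-selection strategy: keep the k nodes with the highest
    scores, then average their embeddings (illustrative)."""
    idx = np.argsort(scores)[-k:]
    return H[idx].mean(axis=0)
```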

Other graph types

Dynamic graphs

graph structure, e.g. the existence of edges and nodes, keeps changing over time.

Signed graphs: graphs with signed edges, i.e. an edge can be either positive or negative.

Large graphs:

Leveraging approximate personalized PageRank, methods proposed by Klicpera et al. (2019) and Bojchevski et al. (2020) avoid calculating high-order propagation matrices. Rossi et al. (2020) propose a method to precompute graph convolutional filters of different sizes for efficient training and inference.
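
For intuition, the power-iteration form behind such personalized-PageRank propagation (as in APPNP by Klicpera et al., 2019) can be written as Z_{k+1} = (1 - alpha) * A_norm @ Z_k + alpha * H, where H holds the node-wise predictions. The code below is a rough numpy rendering under that assumption, not the authors' implementation.

```python
import numpy as np

def ppr_propagate(A_norm, H, alpha=0.1, num_iters=10):
    """Approximate personalized-PageRank propagation by power iteration:
    Z_{k+1} = (1 - alpha) * A_norm @ Z_k + alpha * H.
    Avoids forming any high-order propagation matrix explicitly (sketch)."""
    Z = H
    for _ in range(num_iters):
        Z = (1.0 - alpha) * (A_norm @ Z) + alpha * H
    return Z
```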

Different training settings

Auto-encoders

compute the loss from the similarity between the original adjacency matrix and the reconstructed matrix:
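A hedged sketch of the usual form (following Kipf and Welling's graph auto-encoder): a GCN encoder produces embeddings Z, the adjacency is reconstructed as a sigmoid of the inner products, and the loss compares A with the reconstruction, e.g. element-wise cross-entropy:

\[
Z = \mathrm{GCN}(X, A), \qquad \hat{A} = \sigma\!\left(Z Z^{\top}\right)
\]
\[
\mathcal{L} = -\sum_{i,j} \Big[ A_{ij}\,\log \hat{A}_{ij} + (1 - A_{ij})\,\log\big(1 - \hat{A}_{ij}\big) \Big]
\]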

Adversarially Regularized Graph Auto-encoder