Graph neural network

If you can represent your data as a graph, you can use a graph neural network to perform inference on it.
GNNs operate on a global graph embedding as well as local embeddings attached to each node and edge of the graph.
Hence, a GNN can output predictions for each node, for each edge, or for the entire graph.

Introduction

Structure

A graph neural network consists of layers which operate directly on graphs.
Typically, one or more GNN layers compute new embeddings for the nodes, edges, and the graph as a whole.
Then a standard MLP can be used to map each embedding to logits or regression values.
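
For instance, a node-classification model might look like the following minimal sketch (assuming PyTorch; SimpleGNN, gnn_layer, and the mean-of-neighbors pooling are illustrative choices rather than a standard architecture):

  import torch
  import torch.nn as nn

  class SimpleGNN(nn.Module):
      def __init__(self, in_dim, hidden_dim, num_classes):
          super().__init__()
          # two "GNN layers": each mean-pools a node with its neighbors, then
          # applies a learned linear update (see gnn_layer below)
          self.lin1 = nn.Linear(in_dim, hidden_dim)
          self.lin2 = nn.Linear(hidden_dim, hidden_dim)
          # standard MLP head that turns each node embedding into class logits
          self.head = nn.Sequential(
              nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
              nn.Linear(hidden_dim, num_classes),
          )

      def gnn_layer(self, lin, x, adj):
          # pool: average each node with its neighbors; update: linear + ReLU
          deg = adj.sum(dim=1, keepdim=True)
          pooled = (adj @ x + x) / (deg + 1)
          return torch.relu(lin(pooled))

      def forward(self, x, adj):
          # x: (num_nodes, in_dim) features, adj: (num_nodes, num_nodes) 0/1 float adjacency
          x = self.gnn_layer(self.lin1, x, adj)
          x = self.gnn_layer(self.lin2, x, adj)
          return self.head(x)  # (num_nodes, num_classes) logits, one row per node

  # toy usage: a 5-node ring graph with 16-dim node features and 3 classes
  adj = torch.zeros(5, 5)
  for u, v in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]:
      adj[u, v] = adj[v, u] = 1.0
  logits = SimpleGNN(16, 32, 3)(torch.randn(5, 16), adj)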

At each layer, you have access to the following features during inference (see the sketch after this list):

  • The graph structure
  • Node embeddings
  • Edge embeddings
  • Graph embeddings
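
As a minimal sketch of that per-layer state (assuming Python with NumPy; GraphState and its field names are hypothetical):

  from dataclasses import dataclass
  import numpy as np

  @dataclass
  class GraphState:
      edges: list            # graph structure: (source, target) index pairs
      node_emb: np.ndarray   # (num_nodes, node_dim) one embedding per node
      edge_emb: np.ndarray   # (num_edges, edge_dim) one embedding per edge
      graph_emb: np.ndarray  # (graph_dim,) single global graph embedding

  # toy example: a 3-node path graph with zero-initialized 8-dim embeddings
  state = GraphState(
      edges=[(0, 1), (1, 2)],
      node_emb=np.zeros((3, 8)),
      edge_emb=np.zeros((2, 8)),
      graph_emb=np.zeros(8),
  )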

Message Passing Layer

A standard GNN layer consists of pooling functions followed by update functions.

Pooling

There are many types of pooling to choose from (a sum-pooling sketch follows this list):

  • For a node, you can sum its embedding with the embeddings of connected nodes or of its incident edges.
  • Similarly, for an edge, you can sum its embedding with the embeddings of its two endpoint nodes.
  • For the entire graph, you can sum all node embeddings together.
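
A minimal sum-pooling sketch of the three cases above (assuming Python with NumPy; the toy graph and variable names are made up for illustration):

  import numpy as np

  num_nodes, node_dim, edge_dim = 4, 8, 8
  edges = [(0, 1), (1, 2), (2, 3), (3, 0)]  # (source, target) pairs
  node_emb = np.random.randn(num_nodes, node_dim)
  edge_emb = np.random.randn(len(edges), edge_dim)

  # node-level pooling: sum embeddings of connected nodes and of incident edges
  pooled_node_neighbors = np.zeros_like(node_emb)
  pooled_incident_edges = np.zeros((num_nodes, edge_dim))
  for e, (u, v) in enumerate(edges):
      pooled_node_neighbors[u] += node_emb[v]
      pooled_node_neighbors[v] += node_emb[u]
      pooled_incident_edges[u] += edge_emb[e]
      pooled_incident_edges[v] += edge_emb[e]

  # edge-level pooling: sum the embeddings of each edge's two endpoint nodes
  pooled_endpoints = np.stack([node_emb[u] + node_emb[v] for u, v in edges])

  # graph-level pooling: sum all node embeddings into one graph embedding
  graph_emb = node_emb.sum(axis=0)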

After pooling, you can use an update function (e.g. an MLP) to update the embedding/state of each node, each edge, and the entire graph.
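
A minimal sketch of such an update step for node embeddings (assuming PyTorch; concatenating each node's own embedding with its pooled neighbor sum before the MLP is one common design choice, not the only one):

  import torch
  import torch.nn as nn

  num_nodes, dim = 4, 8
  edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
  node_emb = torch.randn(num_nodes, dim)

  # pooling: sum of neighboring node embeddings, as in the sketch above
  pooled = torch.zeros_like(node_emb)
  for u, v in edges:
      pooled[u] += node_emb[v]
      pooled[v] += node_emb[u]

  # update: an MLP maps [own embedding, pooled message] to a new embedding
  update_mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim))
  new_node_emb = update_mlp(torch.cat([node_emb, pooled], dim=1))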

Resources