  • GCN
  • GAT
  • GraphSAGE
  • GIN
GCN:

$$h_v^{(0)} = x_v$$

Node $v$'s initial embedding is just its original feature vector, for all $v \in V$,

and for $k = 1, 2, \ldots$ up to $K$:

$$h_v^{(k)} = f^{(k)}\!\left( W^{(k)} \cdot \frac{\sum\limits_{u \in \mathcal{N}(v)} h_u^{(k-1)}}{|\mathcal{N}(v)|} + B^{(k)} \cdot h_v^{(k-1)} \right)$$

for all $v \in V$: node $v$'s embedding at step $k$ is computed from the mean of its neighbours' embeddings at step $k-1$, together with its own embedding at step $k-1$.
Legend:
  • $h_v$: embedding of node $v$.
  • $h_u$: embedding of a neighbour $u$ of node $v$.
  • $f^{(k)}$, $W^{(k)}$, $B^{(k)}$, $A^{(k)}$, $\epsilon^{(k)}$: (potentially) learnable parameters.
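As a concrete sketch, the GCN update above fits in a few lines of NumPy. Everything here beyond the equation itself is an assumption for illustration: the name `gcn_layer`, the choice of $\tanh$ for $f^{(k)}$, and the random toy graph.

```python
import numpy as np

def gcn_layer(H, adj, W, B, f=np.tanh):
    """One GCN-style update: h_v <- f(W · mean of neighbours' h_u + B · h_v).

    H:    (n, d) node embeddings at step k-1
    adj:  (n, n) binary adjacency matrix, no self-loops
    W, B: weight matrices; f: elementwise nonlinearity
    """
    deg = adj.sum(axis=1, keepdims=True)          # |N(v)| for each node
    neigh_mean = (adj @ H) / np.maximum(deg, 1)   # mean over N(v); guards isolated nodes
    return f(neigh_mean @ W.T + H @ B.T)

# Tiny example: a 3-node path graph with 4-dimensional features.
rng = np.random.default_rng(0)
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
H0 = rng.normal(size=(3, 4))
W = rng.normal(size=(4, 4))
B = rng.normal(size=(4, 4))
H1 = gcn_layer(H0, adj, W, B)   # embeddings at step k = 1
```

With $W$ the identity, $B = 0$, and $f$ the identity, the update reduces to plain neighbourhood averaging, which is an easy way to sanity-check an implementation.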
GAT:

$$h_v^{(0)} = x_v$$

Node $v$'s initial embedding is just its original feature vector, for all $v \in V$,

and for $k = 1, 2, \ldots$ up to $K$:

$$h_v^{(k)} = f^{(k)}\!\left( W^{(k)} \cdot \left[ \sum\limits_{u \in \mathcal{N}(v)} \alpha_{vu}^{(k-1)} h_u^{(k-1)} + \alpha_{vv}^{(k-1)} h_v^{(k-1)} \right] \right)$$

for all $v \in V$: node $v$'s embedding at step $k$ combines a weighted mean of its neighbours' embeddings at step $k-1$ with its own embedding at step $k-1$,

where the attention weights $\alpha^{(k)}$ are generated by an attention mechanism $A^{(k)}$, normalized such that the sum over all neighbours of each node $v$ is $1$:

$$\alpha_{vu}^{(k)} = \frac{A^{(k)}\!\left(h_v^{(k)}, h_u^{(k)}\right)}{\sum\limits_{w \in \mathcal{N}(v)} A^{(k)}\!\left(h_v^{(k)}, h_w^{(k)}\right)}$$

for all $(v, u) \in E$.
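A minimal NumPy sketch of the GAT-style update, under two stated simplifications: the weights $\alpha$ are normalized over $\mathcal{N}(v) \cup \{v\}$ (so $\alpha_{vv}$ is normalized along with the rest, whereas the formula above normalizes over $\mathcal{N}(v)$ only), and a toy attention mechanism, an exponentiated dot product, stands in for a learned $A^{(k)}$.

```python
import numpy as np

def gat_layer(H, adj, W, attn, f=np.tanh):
    """One GAT-style update with attention weights normalized over N(v) ∪ {v}.

    attn(h_v, h_u) -> positive scalar score, playing the role of A(h_v, h_u).
    """
    n = H.shape[0]
    mask = adj + np.eye(n)                        # include v itself in the weighted sum
    scores = np.zeros((n, n))
    for v in range(n):
        for u in range(n):
            if mask[v, u]:
                scores[v, u] = attn(H[v], H[u])   # unnormalized attention score
    alpha = scores / scores.sum(axis=1, keepdims=True)  # each row sums to 1
    return f((alpha @ H) @ W.T), alpha

# Toy attention mechanism: exponentiated dot product (always positive).
attn = lambda hv, hu: np.exp(hv @ hu)

rng = np.random.default_rng(1)
adj = np.array([[0., 1., 1.],     # star graph: node 0 linked to nodes 1 and 2
                [1., 0., 0.],
                [1., 0., 0.]])
H0 = rng.normal(size=(3, 2))
W = rng.normal(size=(2, 2))
H1, alpha = gat_layer(H0, adj, W, attn)
```

In the actual GAT architecture the scores come from a learned function of the two embeddings (e.g. a small neural network over their concatenation), not from a fixed dot product.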
GIN:

$$h_v^{(0)} = x_v$$

Node $v$'s initial embedding is just its original feature vector, for all $v \in V$,

and for $k = 1, 2, \ldots$ up to $K$:

$$h_v^{(k)} = f^{(k)}\!\left( \sum\limits_{u \in \mathcal{N}(v)} h_u^{(k-1)} + \left(1 + \epsilon^{(k)}\right) \cdot h_v^{(k-1)} \right)$$

for all $v \in V$: node $v$'s embedding at step $k$ is computed from the sum of its neighbours' embeddings at step $k-1$, plus a scaled copy of its own embedding at step $k-1$.
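The GIN update is the simplest to sketch. In GIN proper $f^{(k)}$ is an MLP; the identity function below is an assumption made only so the arithmetic can be checked by hand.

```python
import numpy as np

def gin_layer(H, adj, eps, f):
    """One GIN update: h_v <- f( sum_{u in N(v)} h_u + (1 + eps) * h_v )."""
    return f(adj @ H + (1.0 + eps) * H)

# Two nodes joined by a single edge, one-hot features, f = identity, eps = 0:
# each node's new embedding is its own embedding plus its neighbour's.
adj = np.array([[0., 1.],
                [1., 0.]])
H0 = np.array([[1., 0.],
               [0., 1.]])
H1 = gin_layer(H0, adj, eps=0.0, f=lambda z: z)
```

Raising $\epsilon$ above $0$ weights a node's own previous embedding more heavily relative to the neighbour sum.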
GraphSAGE:

$$h_v^{(0)} = x_v$$

Node $v$'s initial embedding is just its original feature vector, for all $v \in V$,

and for $k = 1, 2, \ldots$ up to $K$: