#ai #computing
# [[Epistemic status]]
#shower-thought
# Graph neural networks
#to-digest
![[Screenshot 2022-03-13 at 10.20.34.png]]
![[Screenshot 2022-03-13 at 10.21.04.png]]
>Graph Neural Networks (GNNs) or Graph Convolutional Networks (GCNs) build representations of nodes and edges in graph data. They do so through **neighbourhood aggregation** (or message passing), where each node gathers features from its neighbours to update its representation of the _local_ graph structure around it. Stacking several GNN layers enables the model to propagate each node's features over the entire graph—from its neighbours to the neighbours' neighbours, and so on.
[^joshi2020transformers]
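The aggregation step described in the quote can be sketched in plain numpy. This is a toy mean-aggregation layer: the adjacency matrix, feature sizes, and the ReLU nonlinearity are illustrative assumptions, not from the source.

```python
import numpy as np

def gnn_layer(H, A, W):
    """One round of neighbourhood aggregation (mean variant).

    H: (n, d) node features, A: (n, n) adjacency matrix, W: (d, d_out) weights.
    Each node averages its neighbours' features (plus its own via a
    self-loop), then applies a shared linear map and a ReLU.
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)    # neighbourhood sizes
    H_agg = (A_hat @ H) / deg                 # mean over the neighbourhood
    return np.maximum(H_agg @ W, 0)           # linear map + ReLU

# Path graph 0-1-2: stacking layers propagates node 0's features outward,
# from its neighbour to the neighbour's neighbour.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = np.eye(3)                                  # one-hot initial features
W = np.eye(3)                                  # identity weights, for clarity
H1 = gnn_layer(H, A, W)   # after one layer, node 2 has no signal from node 0
H2 = gnn_layer(H1, A, W)  # after two layers, it does
```

This makes the "stacking several GNN layers" point concrete: each layer widens the receptive field of every node by one hop.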
![[Screenshot 2022-03-16 at 08.45.48.png]]
## Three flavours of GNN layers
![[Screenshot 2022-03-15 at 07.40.36.png]]
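The screenshot isn't captured as text; assuming it shows the common taxonomy — convolutional (fixed neighbour weights from graph structure), attentional (feature-dependent weights), and general message-passing (arbitrary per-edge messages) — the three flavours differ only in how neighbour messages are weighted. A toy numpy sketch of each; the graph, features, score function, and message function are all illustrative assumptions:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Toy star graph (node 0 linked to nodes 1 and 2) and random features.
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
H = np.random.default_rng(0).normal(size=(3, 4))

# 1. Convolutional (GCN-style): weights fixed by the graph structure.
deg = A.sum(axis=1, keepdims=True)
h_conv = (A / deg) @ H                     # plain neighbourhood mean

# 2. Attentional (GAT-style): weights computed from feature pairs.
scores = H @ H.T                           # toy compatibility scores
scores = np.where(A > 0, scores, -np.inf)  # restrict to actual neighbours
h_attn = softmax(scores) @ H               # weighted neighbourhood sum

# 3. Message-passing (MPNN-style): arbitrary function of both endpoints.
def message(h_u, h_v):
    return np.tanh(h_u + h_v)              # toy message function
h_mp = np.array([
    sum(message(H[u], H[v]) for v in np.nonzero(A[u])[0])
    for u in range(A.shape[0])
])
```

Each variant is strictly more expressive than the last: attention recovers convolution when scores are uniform, and message passing recovers attention when the message is a scaled copy of the neighbour's features.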
## Message passing
![[Pasted image 20220315074021.png]]
## In NLP
![[Pasted image 20220316085410.png]]
>Another issue with fully-connected graphs is that they make **learning very long-term dependencies between words difficult**.
>[^joshi2020transformers]
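The fully-connected-graph view from the cited article can be made concrete: single-head self-attention is neighbourhood aggregation over the complete graph of words, with attention scores acting as soft edge weights. A minimal sketch; the dimensions and random weight matrices are illustrative assumptions:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(H, Wq, Wk, Wv):
    """Single-head self-attention as a GNN layer on the fully-connected
    word graph: every word aggregates from every other word."""
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    attn = softmax(Q @ K.T / np.sqrt(K.shape[-1]))  # (n, n) soft edge weights
    return attn @ V                                 # weighted neighbourhood sum

rng = np.random.default_rng(0)
n, d = 5, 8                        # 5 "words", 8-dim embeddings (toy sizes)
H = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(H, Wq, Wk, Wv)  # every word's update mixes all n words
```

Because the graph is complete, a single layer already connects every pair of words, which is also why the long-range-dependency issue in the quote becomes a question of what the learned attention weights do rather than of graph distance.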
## Using [[Reinforcement Learning]]
![[1FA15A9D-CF9F-4796-B43F-AF24F3D50749.jpeg]]
![[D0059E46-E913-45DB-9CA8-3B55D3E65F2C.png]]
![[B69DBB14-49CE-4567-917B-DC977DDA9F40.png]]
![[8CA8CC69-073B-432E-971F-C52D0E02D94A.png]]
![[29828FA7-6AD9-4CF1-85AC-591FBD5E34C0.png]]
![[BF4D23E7-6DBD-4936-AF0D-4662B7288033.png]]
![[C810E2CB-E75D-41DE-92A5-80F936C72AFA.png]]
![[2E456A9B-B53A-4AF1-8F2B-7C13355B08A1.png]]
![[8F4BDAB8-A264-4A60-886F-00380C0601DA.png]]
![[BBE1E5EB-8837-4539-A328-51E18F5ED23F.png]]
# External links
- https://deepmind.com/blog/article/traffic-prediction-with-advanced-graph-neural-networks
[^joshi2020transformers]: https://thegradient.pub/transformers-are-graph-neural-networks/