#ai
# [[Epistemic status]]
#shower-thought
# Neural networks
## Backpropagation complexity
![[Pasted image 20221116101055.png]]
A good way to think about backpropagation is that its algorithmic complexity is $O(mn)$ per layer, where $m$ is the number of neurons in a given layer and $n$ is the number of neurons in the next layer. Summing over all layers, one can show this is equivalent to $O(W)$, where $W$ is the number of synapses (weights) in the network.
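A minimal sketch of this counting argument, assuming fully connected layers and NumPy (the layer sizes below are illustrative, not from the note): backpropagating through one dense layer is dominated by two $m \times n$ operations, so the total work is proportional to the total weight count.

```python
import numpy as np

def backprop_layer(W, x, grad_out):
    """Backprop through one dense layer y = W @ x.
    W: (n, m), x: (m,), grad_out: dL/dy of shape (n,).
    Both gradients cost ~m*n multiply-adds, hence O(mn) per layer."""
    grad_W = np.outer(grad_out, x)  # dL/dW, an n-by-m outer product
    grad_x = W.T @ grad_out         # dL/dx, an m-by-n matvec
    return grad_W, grad_x

# Summing m_i * n_i over layers gives the total weight count W,
# so whole-network backprop is O(W).
sizes = [784, 128, 64, 10]  # hypothetical layer widths
total_weights = sum(a * b for a, b in zip(sizes, sizes[1:]))
```

The per-layer cost and the $O(W)$ total are two views of the same sum: each weight is touched a constant number of times during the backward pass.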
# External links