* Activation (typically ReLU or some variant).
More traditionally, convolutional blocks consisted of two conv layers:
* Conv2D layer.
* Activation.
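The two-conv-layer block above can be sketched in pure Python. This is a minimal illustration on a single-channel input, not a practical implementation; in practice you would use a framework such as Keras (`Conv2D`) or PyTorch (`nn.Conv2d`), and the `conv2d`, `relu`, and `conv_block` helpers plus the 2x2 example kernels here are hypothetical names chosen for the sketch.

```python
def conv2d(image, kernel):
    """Valid 2D convolution (strictly cross-correlation, as in most DL libraries)."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            # Dot product of the kernel with the current image patch.
            row.append(sum(image[i + a][j + b] * kernel[a][b]
                           for a in range(kh) for b in range(kw)))
        out.append(row)
    return out

def relu(feature_map):
    # Element-wise max(0, x) activation.
    return [[max(0.0, v) for v in row] for row in feature_map]

def conv_block(image, k1, k2):
    # The "traditional" block: Conv2D -> ReLU -> Conv2D -> ReLU.
    return relu(conv2d(relu(conv2d(image, k1)), k2))

x = [[1.0, -2.0, 3.0, 0.0],
     [0.0, 1.0, -1.0, 2.0],
     [2.0, 0.0, 1.0, -3.0],
     [1.0, 1.0, 0.0, 1.0]]
k = [[1.0, -1.0], [-1.0, 1.0]]   # hypothetical 2x2 kernels
out = conv_block(x, k, k)
print(out)  # 2x2 feature map: 4x4 input shrinks by 1 per valid 2x2 conv
```

Each valid convolution with a 2x2 kernel shrinks the spatial size by one, so the 4x4 input yields a 3x3 map after the first conv+ReLU and a 2x2 map after the second.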