FLOPs is a measure of model complexity in deep learning.

## FLOPs

The number of floating-point operations (FLOPs) is a common measure of the complexity of neural models.

Following [1], assume that convolution is implemented as a sliding window and that the nonlinearity is computed for free.

For convolutional kernels, we have:

$$\textrm{FLOPs} = 2 H W (C_\textrm{in} K^2 + 1) C_\textrm{out},$$

where $H$, $W$, and $C_\textrm{in}$ are the height, width, and number of channels of the input feature map, $K$ is the kernel width (the kernel is assumed square), and $C_\textrm{out}$ is the number of output channels. The factor of 2 counts one multiplication and one addition per operation, and the $+1$ accounts for the bias.
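As a minimal sketch, the convolutional FLOPs count $2HW(C_\textrm{in}K^2 + 1)C_\textrm{out}$ from [1] can be wrapped in a small helper (`conv_flops` is an illustrative name, not from the text):

```python
def conv_flops(h, w, c_in, k, c_out):
    """FLOPs of one convolutional layer, per Molchanov et al. [1]:
    2 * H * W * (C_in * K^2 + 1) * C_out.
    The factor of 2 counts a multiply and an add; the +1 is the bias."""
    return 2 * h * w * (c_in * k ** 2 + 1) * c_out

# Example: a 7x7 conv mapping 3 channels to 64, with a 112x112 output map
# (roughly the first layer of a typical ImageNet CNN).
print(conv_flops(112, 112, 3, 7, 64))
```

Note that the count scales with the spatial size of the *output* feature map, since the kernel is applied once per output position.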

For MLP layers, we have:

$$\textrm{FLOPs} = (2I - 1) O,$$

where $I$ is the input dimensionality and $O$ is the output dimensionality. Each output unit requires $I$ multiplications and $I - 1$ additions.
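The fully connected count $(2I - 1)O$ from [1] admits an equally short sketch (`fc_flops` is an illustrative name):

```python
def fc_flops(i, o):
    """FLOPs of one fully connected layer, per Molchanov et al. [1]:
    (2I - 1) * O, i.e. I multiplies plus I - 1 adds per output unit."""
    return (2 * i - 1) * o

# Example: the 4096 -> 1000 classifier head of a VGG-style network.
print(fc_flops(4096, 1000))
```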

• FLOPs is an abbreviation of floating-point operations, which include multiplications, additions, divisions, etc.
• MACs stands for multiply-accumulate operations; one MAC performs $a \leftarrow a + (b \times c)$, i.e., one multiplication and one addition, so one MAC is commonly counted as two FLOPs.
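To illustrate the relation between the two counts, a hypothetical helper `conv_macs` (not from the text) counts one MAC per kernel tap per output element; doubling it recovers the multiply-and-add part of the FLOPs formula, differing only in how the bias is counted:

```python
def conv_macs(h, w, c_in, k, c_out):
    """MACs of one convolutional layer: one multiply-accumulate per
    (input channel, kernel position) pair, per output element."""
    return h * w * c_in * k ** 2 * c_out

# One MAC = one multiply + one add, so FLOPs is roughly 2 * MACs
# (up to how bias terms are counted).
print(conv_macs(112, 112, 3, 7, 64))
print(2 * conv_macs(112, 112, 3, 7, 64))
```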

## References

[1] Molchanov, Pavlo, et al. "Pruning convolutional neural networks for resource efficient inference." arXiv preprint arXiv:1611.06440 (2016).