
FLOPs Computation

FLOPs is a measure of model complexity in deep learning.

FLOPs

The number of floating point operations (FLOPs) is a standard measure of the complexity of neural models.

Following Molchanov et al. [1], assume that convolution is implemented as a sliding window and that the nonlinearity is computed for free.

For convolutional kernels, we have:

$$\text{FLOPs} = 2 H W \left( C_{in} K^2 + 1 \right) C_{out}$$

where $H$, $W$, and $C_{in}$ are the height, width, and number of channels of the input feature map, $K$ is the kernel width (assumed to be symmetric), and $C_{out}$ is the number of output channels.
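As a sanity check, here is a minimal sketch of the formula in plain Python (the function name and the example layer shape are mine, chosen to resemble the first layer of a VGG-style network):

```python
def conv2d_flops(h, w, c_in, k, c_out):
    """FLOPs of one convolutional layer: 2 * H * W * (C_in * K**2 + 1) * C_out.

    The factor of 2 counts multiplies and adds separately, and the +1
    accounts for the bias term, per the sliding-window assumption above.
    """
    return 2 * h * w * (c_in * k ** 2 + 1) * c_out


# Hypothetical example: a 3x3 conv over a 224x224x3 feature map, 64 output channels
print(f"{conv2d_flops(224, 224, 3, 3, 64):,}")  # 179,830,784 (~0.18 GFLOPs)
```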

For MLP layers, we have:

$$\text{FLOPs} = (2I - 1) O$$

where $I$ is the input dimensionality and $O$ is the output dimensionality.
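The same check for a fully connected layer, again only a sketch (the 4096 → 1000 shape is a hypothetical classifier head):

```python
def linear_flops(i, o):
    """FLOPs of one fully connected layer: (2I - 1) * O.

    Each of the O outputs takes I multiplications and I - 1 additions;
    the bias is not counted in this formulation.
    """
    return (2 * i - 1) * o


# Hypothetical example: a 4096 -> 1000 fully connected layer
print(f"{linear_flops(4096, 1000):,}")  # 8,191,000 FLOPs
```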

  • FLOPs is an abbreviation of floating point operations, which include multiply, add, divide, etc.
  • MACs stands for multiply-accumulate operations; a single MAC computes $a \leftarrow a + (b \times c)$ and is conventionally counted as two FLOPs (see the sketch below).
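A small illustration of the MAC-to-FLOP relationship (the helper name and layer shape are mine, not from the reference):

```python
def macs_to_flops(macs):
    # One multiply-accumulate (a <- a + b * c) is one multiply plus one add,
    # so it is conventionally counted as 2 FLOPs.
    return 2 * macs


# MACs of the 3x3 conv above, ignoring the bias: H * W * C_in * K**2 * C_out
macs = 224 * 224 * 3 * 3 ** 2 * 64
print(f"{macs:,} MACs -> {macs_to_flops(macs):,} FLOPs")
```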

References

[1] Molchanov, Pavlo, et al. “Pruning convolutional neural networks for resource efficient inference.” arXiv preprint arXiv:1611.06440 (2016).