
Softmax incurs a large computational cost when the output vocabulary is very large. Several feasible approaches are explained below in the context of the skip-gram pretraining task.
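One widely used remedy (which may or may not be among the approaches covered in the full post) is negative sampling, which replaces the vocabulary-sized softmax with a handful of binary classifications per pair. The NumPy sketch below is illustrative only: the table sizes, `num_neg`, and the uniform negative sampler are made-up simplifications (word2vec actually samples from a unigram distribution raised to the 0.75 power).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes; a real vocabulary would be much larger.
vocab_size, dim, num_neg = 10_000, 128, 5

# Input (center) and output (context) embedding tables.
W_in = rng.normal(scale=0.01, size=(vocab_size, dim))
W_out = rng.normal(scale=0.01, size=(vocab_size, dim))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neg_sampling_loss(center_id, context_id):
    """Skip-gram loss for one (center, context) pair, scoring only
    1 + num_neg words instead of the whole vocabulary."""
    v = W_in[center_id]                              # center word vector
    neg_ids = rng.integers(0, vocab_size, size=num_neg)
    pos_score = sigmoid(W_out[context_id] @ v)       # true context word
    neg_scores = sigmoid(-W_out[neg_ids] @ v)        # sampled negatives
    return -np.log(pos_score) - np.log(neg_scores).sum()

print(neg_sampling_loss(center_id=42, context_id=7))
```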

Read more »

FLOPs is a measure of model complexity in deep learning.

FLOPs

The number of floating point operations (FLOPs) measures the complexity of neural models.

Following [1], assume that convolution is implemented as a sliding window and that the nonlinearity is computed for free.

For convolutional kernels, we have:

$$\text{FLOPs} = 2 H W (C_{in} K^2 + 1) C_{out},$$

where $H$, $W$, and $C_{in}$ are the height, width, and number of channels of the input feature map, $K$ is the kernel width (assumed to be symmetric), and $C_{out}$ is the number of output channels.
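As a quick sanity check, the formula can be evaluated directly; the layer sizes below are made up for illustration.

```python
def conv2d_flops(h, w, c_in, k, c_out):
    """FLOPs of a conv layer under the sliding-window assumption:
    2 * H * W * (C_in * K^2 + 1) * C_out  (the +1 accounts for the bias)."""
    return 2 * h * w * (c_in * k ** 2 + 1) * c_out

# e.g. a 3x3 conv mapping a 56x56x64 feature map to 128 output channels
print(conv2d_flops(h=56, w=56, c_in=64, k=3, c_out=128))  # ~4.6e8 FLOPs
```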

For MLP layers, we have:

$$\text{FLOPs} = (2 I - 1) O,$$

where $I$ is the input dimensionality and $O$ is the output dimensionality.
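The same kind of helper works for a fully connected layer; again, the sizes are only illustrative.

```python
def linear_flops(i, o):
    """FLOPs of a fully connected layer: (2*I - 1)*O,
    i.e. I multiplies and I - 1 adds per output unit."""
    return (2 * i - 1) * o

# e.g. a 4096 -> 1000 classifier head
print(linear_flops(i=4096, o=1000))  # 8,191,000 FLOPs
```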

  • FLOPs is the abbreviation of floating point operations, which include multiplication, addition, division, etc.
  • MACs stands for multiply-accumulate operations, each of which performs $a \leftarrow a + b \times c$ (see the sketch below).
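Since one MAC is one multiply plus one add, MAC counts are commonly converted to FLOPs by simply doubling them; a trivial sketch:

```python
def macs_to_flops(macs):
    """One multiply-accumulate (a <- a + b * c) is one multiply plus one add,
    so a MAC count is commonly converted to FLOPs by doubling it."""
    return 2 * macs

# e.g. a layer reported at 1.8 GMACs
print(macs_to_flops(1.8e9))  # 3.6e9 FLOPs
```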

References

[1] Molchanov, Pavlo, et al. “Pruning convolutional neural networks for resource efficient inference.” arXiv preprint arXiv:1611.06440 (2016).

Dynamic Programming (DP) is ubiquitous in NLP, appearing in algorithms such as minimum edit distance, Viterbi decoding, the forward/backward algorithm, and the CKY algorithm.
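As a small illustration of the first of these, here is a minimal edit-distance DP with unit insertion/deletion/substitution costs (some formulations charge 2 for substitution, which changes the numbers but not the recurrence):

```python
def min_edit_distance(src: str, tgt: str) -> int:
    """Levenshtein distance via dynamic programming:
    dp[i][j] = cost of turning src[:i] into tgt[:j]."""
    m, n = len(src), len(tgt)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i                      # delete all of src[:i]
    for j in range(n + 1):
        dp[0][j] = j                      # insert all of tgt[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if src[i - 1] == tgt[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + sub)  # substitution / match
    return dp[m][n]

print(min_edit_distance("intention", "execution"))  # 5 with unit costs
```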

Read more »


The main aim of the convolution operation is to extract useful features for downstream tasks. Intuitively, different filters learn to extract different aspects of the input via backpropagation during training. Afterward, all the extracted features are combined to make decisions.
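A minimal NumPy sketch of the sliding-window view: a single hand-crafted edge filter responds where the pixel intensity changes, which is the kind of pattern a learned filter might pick up. The image and filter here are invented purely for illustration.

```python
import numpy as np

def conv2d_single(x, kernel):
    """Naive valid 2-D convolution (really cross-correlation, as in most
    deep learning libraries): slide the filter over the input and sum the
    elementwise products at each position."""
    kh, kw = kernel.shape
    out_h, out_w = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

# Toy image: dark left half, bright right half; a vertical-edge filter.
image = np.tile(np.array([0, 0, 0, 1, 1, 1], dtype=float), (6, 1))
edge_filter = np.array([[1.0, 0.0, -1.0]] * 3)
print(conv2d_single(image, edge_filter))  # nonzero only around the edge columns
```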

Read more »