
The softmax incurs a large computational cost when the output vocabulary is very large. Several feasible approximations are explained in the context of the skip-gram pretraining task.
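As a rough illustration (not taken from the post itself), the NumPy sketch below contrasts the full softmax, whose normalizer touches every vocabulary row, with negative sampling, which only scores the true pair plus a few sampled negatives. All sizes, indices, and variable names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 50_000, 300                    # assumed vocabulary size and embedding dim
W_in = rng.normal(0, 0.01, (V, d))    # center-word embeddings
W_out = rng.normal(0, 0.01, (V, d))   # context-word embeddings

center, context = 123, 456            # hypothetical word indices

# Full softmax: the log-normalizer sums over all V rows -> O(V * d) per pair.
logits = W_out @ W_in[center]                                   # shape (V,)
log_p = logits[context] - logits.max() \
        - np.log(np.exp(logits - logits.max()).sum())

# Negative sampling: score the true pair plus k negatives -> O(k * d) per pair.
k = 5
negatives = rng.integers(0, V, size=k)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
pos_score = sigmoid(W_out[context] @ W_in[center])
neg_scores = sigmoid(-W_out[negatives] @ W_in[center])
loss = -np.log(pos_score) - np.log(neg_scores).sum()
```

The point of the contrast is the cost of the normalization term, which is exactly what hierarchical softmax and sampling-based methods try to avoid.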

Read more »

Dynamic programming (DP) is ubiquitous in NLP, underlying minimum edit distance, Viterbi decoding, the forward/backward algorithm, the CKY algorithm, and more.
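As a minimal example of the DP pattern (a sketch, not code from the post), the function below computes minimum edit distance with unit insert/delete/substitute costs by filling a table of subproblem solutions.

```python
def min_edit_distance(src: str, tgt: str) -> int:
    """Levenshtein distance with unit insert/delete/substitute costs."""
    m, n = len(src), len(tgt)
    # dp[i][j] = cost of transforming src[:i] into tgt[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i                              # delete all of src[:i]
    for j in range(n + 1):
        dp[0][j] = j                              # insert all of tgt[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if src[i - 1] == tgt[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,      # deletion
                           dp[i][j - 1] + 1,      # insertion
                           dp[i - 1][j - 1] + sub)  # substitution / match
    return dp[m][n]

print(min_edit_distance("intention", "execution"))  # 5 under unit costs
```

Viterbi, forward/backward, and CKY follow the same recipe: a recurrence over a table indexed by positions (and states or spans) instead of characters.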

Read more »


The main aim of the convolution operation is to extract useful features for downstream tasks. Intuitively, different filters learn to extract different aspects of the input through backpropagation during training; afterward, all the extracted features are combined to make decisions.
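The NumPy sketch below (an illustration, not the post's own code) shows this idea on a toy sequence of word embeddings: each filter slides over the sequence to produce a response map, and max-over-time pooling combines each filter's responses into a single feature. Shapes and names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d = 10, 8                        # toy sentence length and embedding dim
x = rng.normal(size=(seq_len, d))         # toy word embeddings

num_filters, width = 4, 3                 # each filter spans 3 consecutive words
filters = rng.normal(size=(num_filters, width, d))

# Slide every filter over the sequence; in a real model each filter's weights
# are learned via backprop and come to respond to a different local pattern.
conv_out = np.empty((num_filters, seq_len - width + 1))
for f in range(num_filters):
    for t in range(seq_len - width + 1):
        conv_out[f, t] = np.sum(filters[f] * x[t:t + width])

# Max-over-time pooling keeps each filter's strongest response; the pooled
# vector is what a downstream classifier would consume.
features = conv_out.max(axis=1)           # shape: (num_filters,)
print(features)
```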

Read more »