A note on code pre-trained language models (PLMs).
Mask Denoising Strategy for Pre-trained Models
Masked modeling plays a crucial role in pre-training language models. This note provides a short summary.
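For illustration, here is a minimal sketch of BERT-style token masking (the common 80/10/10 replacement rule) on a toy vocabulary; the function name and vocabulary are made up for this example, not taken from any particular library.

```python
import random

MASK = "[MASK]"
VOCAB = ["the", "cat", "sat", "on", "mat"]

def mask_tokens(tokens, mask_prob=0.15):
    """BERT-style masking sketch: select ~15% of positions; of those,
    80% become [MASK], 10% a random token, 10% stay unchanged."""
    inputs, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if random.random() < mask_prob:
            labels[i] = tok                       # the model must predict the original token
            r = random.random()
            if r < 0.8:
                inputs[i] = MASK                  # 80%: replace with [MASK]
            elif r < 0.9:
                inputs[i] = random.choice(VOCAB)  # 10%: replace with a random token
            # remaining 10%: keep the original token
    return inputs, labels

print(mask_tokens(["the", "cat", "sat", "on", "the", "mat"]))
```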
Subword Tokenizers for Pre-trained Models
A summary of subword tokenization for pre-trained models.
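As a concrete example of one subword scheme, a tiny byte-pair-encoding (BPE) style merge loop on a toy corpus is sketched below; the corpus and helper names are made up for illustration and omit the full tokenizer machinery.

```python
from collections import Counter

def get_pair_counts(corpus):
    """Count adjacent symbol pairs across all words (word -> frequency)."""
    pairs = Counter()
    for word, freq in corpus.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, corpus):
    """Merge the chosen symbol pair into a single symbol in every word."""
    merged = " ".join(pair)
    return {word.replace(merged, "".join(pair)): freq for word, freq in corpus.items()}

# Toy corpus: words pre-split into characters, with an end-of-word marker </w>.
corpus = {"l o w </w>": 5, "l o w e r </w>": 2, "n e w e s t </w>": 6, "w i d e s t </w>": 3}
for _ in range(5):
    counts = get_pair_counts(corpus)
    best = max(counts, key=counts.get)  # most frequent adjacent pair
    corpus = merge_pair(best, corpus)
    print("merged:", best)
```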
Scaling Up Pre-trained Models: A Summary
A summary of Large-scale Pre-trained Models (PTMs).
Sequence GANs in a Nutshell
Background: Conventional maximum-likelihood approaches to sequence generation with teacher forcing are inherently prone to exposure bias at inference time because of a training-testing discrepancy: the generator produces a sequence iteratively, conditioned on its own previous predictions, which may never have been observed during training, so the mismatch accumulates as the generated sequence grows. In other words, the model is only trained on demonstrated behavior (real data samples), never in free-running mode.
Generative Adversarial Networks (GANs) hold promise for mitigating these issues when generating discrete sequences, as in language modeling and speech/music generation.
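A minimal sketch of the discrepancy described above, contrasting teacher-forced inputs with free-running generation; `model_step` is a hypothetical stand-in for any autoregressive generator and simply returns a toy next-token distribution.

```python
import random

def model_step(prefix):
    """Hypothetical one-step model: returns a next-token distribution given the prefix."""
    vocab = ["a", "b", "c", "<eos>"]
    return {tok: 1.0 / len(vocab) for tok in vocab}

def teacher_forced_inputs(gold):
    """Training: every step is conditioned on the gold-standard history."""
    return [gold[:t] for t in range(1, len(gold))]

def free_running_generate(max_len=5):
    """Inference: each step is conditioned on the model's own previous predictions,
    which may never have appeared in the training data (exposure bias)."""
    seq = ["<bos>"]
    for _ in range(max_len):
        dist = model_step(seq)
        toks, probs = list(dist), list(dist.values())
        nxt = random.choices(toks, weights=probs)[0]  # sample from the predicted distribution
        seq.append(nxt)
        if nxt == "<eos>":
            break
    return seq

print(teacher_forced_inputs(["<bos>", "a", "b", "<eos>"]))
print(free_running_generate())
```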
Automatic Evaluation Metrics for Language Generation
A summary of automatic evaluation metrics for natural language generation (NLG) applications.
Human evaluation considers adequacy, fidelity, and fluency, but it is quite expensive.
- Adequacy: Does the output convey the same meaning as the input sentence? Is part of the message lost, added, or distorted?
- Fluency: Is the output fluent English? This involves both grammatical correctness and idiomatic word choices.
Thus, automatic evaluation metrics hold great promise for NLG applications such as machine translation, text summarization, image captioning, dialogue generation, and poetry/story generation.
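As one concrete example of such a metric, a simplified sentence-level BLEU sketch (clipped n-gram precision up to bigrams plus a brevity penalty) is shown below; it omits smoothing and multi-reference handling, so treat it as illustrative rather than a reference implementation.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def simple_bleu(candidate, reference, max_n=2):
    """Clipped n-gram precisions (1..max_n) combined with a brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        cand, ref = ngrams(candidate, n), ngrams(reference, n)
        overlap = sum((cand & ref).values())            # clipped n-gram matches
        total = max(sum(cand.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)   # avoid log(0)
    # Brevity penalty: 1 if the candidate is at least as long as the reference.
    bp = min(1.0, math.exp(1 - len(reference) / max(len(candidate), 1)))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

ref = "the cat sat on the mat".split()
cand = "the cat is on the mat".split()
print(round(simple_bleu(cand, ref), 3))
```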
Shell Command Notes
A summary of helpful Bash command cheat sheets.
Image Captioning: A Summary!
A summary of image-to-text translation.
An Introduction to Capsules
Decoding Methods in Language Generation
Summary of common decoding strategies in language generation.
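For a quick illustration, here is a small sketch of three common decoding rules (greedy, top-k sampling, and nucleus/top-p sampling) applied to a single next-token distribution; the toy probabilities are made up for this example.

```python
import random

def greedy(dist):
    """Pick the single most probable token."""
    return max(dist, key=dist.get)

def top_k_sample(dist, k=2):
    """Sample among the k most probable tokens (renormalized by random.choices)."""
    top = sorted(dist.items(), key=lambda kv: kv[1], reverse=True)[:k]
    toks, probs = zip(*top)
    return random.choices(toks, weights=probs)[0]

def nucleus_sample(dist, p=0.9):
    """Sample from the smallest set of tokens whose cumulative probability >= p."""
    items = sorted(dist.items(), key=lambda kv: kv[1], reverse=True)
    nucleus, cum = [], 0.0
    for tok, prob in items:
        nucleus.append((tok, prob))
        cum += prob
        if cum >= p:
            break
    toks, probs = zip(*nucleus)
    return random.choices(toks, weights=probs)[0]

# Toy next-token distribution.
dist = {"cat": 0.5, "dog": 0.3, "mat": 0.15, "zebra": 0.05}
print(greedy(dist), top_k_sample(dist), nucleus_sample(dist))
```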