This is a list of papers I've read. This repository also contains short notes on their key ideas.
| No. | Title | Notes | Link |
|---|---|---|---|
| 1 | XLNet: Generalized Autoregressive Pretraining for Language Understanding | Notes | arxiv |
| 2 | Phrase-Based & Neural Unsupervised Machine Translation | Notes | arxiv |
| 3 | Unsupervised Machine Translation Using Monolingual Corpora Only | Notes | arxiv |
| 4 | Cross-lingual Language Model Pretraining | Notes | arxiv |
| 5 | RoBERTa: A Robustly Optimized BERT Pretraining Approach | Medium Blog | arxiv |
| 6 | Unsupervised Question Answering by Cloze Translation | Notes | arxiv |
| 7 | Unified Language Model Pre-training for Natural Language Understanding and Generation | Notes | arxiv |
| 8 | Attention Is All You Need | Notes | arxiv |
| 9 | Improving Language Understanding by Generative Pre-Training | Notes | link |
| 10 | Towards a Human-like Open-Domain Chatbot | Notes | link |
| 11 | The Evolved Transformer | Notes | link |
| 12 | ACUTE-EVAL: Improved Dialogue Evaluation with Optimized Questions and Multi-turn Comparisons | Notes | link |
| 13 | Deep Equilibrium Models | Notes | link |
| 14 | Deep contextualized word representations | Notes | link |
| 15 | Regularizing and Optimizing LSTM Language Models | Notes | link |
| 16 | Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context | Notes | link |
| 17 | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | Notes | link |