
BERT, or Bidirectional Encoder Representations from Transformers, is a new method of pre-training language representations which obtains state-of-the-art results on a wide array of Natural Language Processing (NLP) tasks.
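As a concrete illustration, here is a minimal sketch of pulling contextual representations out of a pre-trained BERT model. It assumes the Hugging Face `transformers` library and the public `bert-base-uncased` checkpoint, neither of which is specified on this page:

```python
# Minimal sketch: contextual token embeddings from pre-trained BERT.
# Assumes the Hugging Face `transformers` library and the public
# `bert-base-uncased` checkpoint (assumptions, not from this page).
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()  # inference mode; disables dropout

inputs = tokenizer("BERT produces contextual token representations.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One 768-dimensional vector per input token (including [CLS] and [SEP]).
print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, 9, 768])
```

Because the encoder is bidirectional, each token's vector reflects both its left and right context; these representations are what downstream NLP tasks fine-tune or build on.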
