BERT
ugurcanarikan edited this page Feb 23, 2019 · 3 revisions
BERT, or Bidirectional Encoder Representations from Transformers, is a method for pre-training language representations that obtains state-of-the-art results on a wide array of Natural Language Processing (NLP) tasks.