UniLM

We develop pre-trained models for natural language understanding (NLU) and generation (NLG) tasks.

***** New February, 2020: UniLM v2 | MiniLM v1 | LayoutLM v1 | s2s-ft v1 release *****

The family of UniLM:

UniLM: unified pre-training for language understanding and generation

MiniLM (new): small pre-trained models for language understanding and generation

LayoutLM (new): multimodal (text + layout/format + image) pre-training for document understanding (e.g. scanned documents, PDF, etc.)

s2s-ft (new): sequence-to-sequence fine-tuning toolkit
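UniLM's unified pre-training covers both understanding and generation by training a single Transformer with different self-attention masks: bidirectional (NLU), left-to-right (language modeling), and sequence-to-sequence (source tokens attend to the full source; target tokens attend to the source plus preceding targets). The sketch below illustrates those three mask patterns in NumPy; the function name and segment layout are illustrative, not part of the released code.

```python
import numpy as np

def unilm_masks(src_len, tgt_len):
    """Illustrative UniLM-style attention masks for a [source | target] sequence.

    Returns 0/1 matrices where entry (i, j) == 1 means token i may attend to token j.
    """
    n = src_len + tgt_len

    # Bidirectional mask (BERT-style NLU objective): every token sees every token.
    bidirectional = np.ones((n, n), dtype=int)

    # Left-to-right mask (GPT-style LM objective): token i sees tokens 0..i.
    left_to_right = np.tril(np.ones((n, n), dtype=int))

    # Sequence-to-sequence mask: source tokens attend bidirectionally within the
    # source; target tokens attend to the full source and to preceding targets.
    seq2seq = np.zeros((n, n), dtype=int)
    seq2seq[:, :src_len] = 1  # all tokens see the whole source segment
    seq2seq[src_len:, src_len:] = np.tril(np.ones((tgt_len, tgt_len), dtype=int))

    return bidirectional, left_to_right, seq2seq
```

Sharing one set of Transformer parameters across these masks is what lets the same pre-trained model be fine-tuned for both NLU and NLG tasks.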

Update in this release:

***** October 1st, 2019: UniLM v1 release *****

License

This project is licensed under the license found in the LICENSE file in the root directory of this source tree. Portions of the source code are based on the transformers project.

Microsoft Open Source Code of Conduct

Contact Information

For help or issues using UniLM, please submit a GitHub issue.

For other communications related to UniLM, please contact Li Dong (lidong1@microsoft.com) and Furu Wei (fuwei@microsoft.com).
