michele1993/Transformer_from_scratch

Implementation of a Transformer from scratch

Here I provide my own implementation of the encoder-decoder Transformer architecture from the original "Attention is all you need" paper. I train the model on some random data and also test its inference process. This was done with some help from two great tutorials (1 and 2). You can also find some of my personal notes on Transformers here.
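The core operation of that architecture is scaled dot-product attention, softmax(QKᵀ/√d_k)·V, which both the encoder and the decoder build on. As a minimal illustrative sketch (not the code in this repo, which lives in `main.py`), a NumPy version looks like this:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V over the key dimension."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                            # weighted sum of value vectors

# Toy example: 3 queries attending over 4 key/value pairs, d_k = 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8): one output vector per query
```

The repo's actual implementation follows the paper's full multi-head variant; this sketch only shows the single-head building block.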

Figure: the encoder-decoder Transformer architecture.

Run

Simply run:

python main.py
