Transformer


Overview

In this post, we will implement the Transformer described in the paper ‘Attention is All You Need’. The overall content draws on the following reference:

  • Contents
    • Preparing Data
    • Encoder

https://github.com/bentrevett/pytorch-seq2seq/blob/master/1%20-%20Sequence%20to%20Sequence%20Learning%20with%20Neural%20Networks.ipynb
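Before diving into the implementation, it may help to see the core operation the Transformer is built on: scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. Below is a minimal sketch in NumPy, assuming a single head and no masking; the function name and toy shapes are illustrative, not from the original post.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V  (single head, no mask)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # [query_len, key_len]
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 3 query vectors attending over 4 key/value vectors, d_k = 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)                      # (3, 8): one output vector per query
print(np.allclose(w.sum(axis=-1), 1)) # attention weights sum to 1 per query
```

In the actual PyTorch implementation this computation appears inside the multi-head attention module of the Encoder, batched and with masking added.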




© 2021.06. by ekspertos
