BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension (ACL 2020)
Tags
Transformer
Seq2Seq
BART
PPT
BART- Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension.pdf