Papers

Peer-reviewed, International journal
July 2020

Encoder-Decoder Models Can Benefit from Pre-trained Masked Language Models in Grammatical Error Correction.

Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL-2020)
  • Masahiro Kaneko
  • Masato Mita
  • Shun Kiyono
  • Jun Suzuki
  • Kentaro Inui

First page
4248
Last page
4254
Language
English
DOI
10.18653/v1/2020.acl-main.391
Publisher
Association for Computational Linguistics

This paper investigates how to effectively incorporate a pre-trained masked
language model (MLM), such as BERT, into an encoder-decoder (EncDec) model for
grammatical error correction (GEC). The answer to this question is not as
straightforward as one might expect because the previous common methods for
incorporating an MLM into an EncDec model have potential drawbacks when applied
to GEC. For example, the distribution of the inputs to a GEC model can be
considerably different (erroneous, clumsy, etc.) from that of the corpora used
for pre-training MLMs; however, this issue is not addressed in the previous
methods. Our experiments show that our proposed method, where we first
fine-tune an MLM with a given GEC corpus and then use the output of the
fine-tuned MLM as additional features in the GEC model, maximizes the benefit
of the MLM. The best-performing model achieves state-of-the-art performance on
the BEA-2019 and CoNLL-2014 benchmarks. Our code is publicly available at:
https://github.com/kanekomasahiro/bert-gec.
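
The abstract describes using the output of a fine-tuned MLM as additional features for the EncDec GEC model. The sketch below is a minimal, hypothetical illustration of that general idea, not the authors' implementation (see the linked repository for that): a frozen, already fine-tuned BERT serves as a feature extractor whose hidden states are concatenated with the GEC encoder's token embeddings and projected back to the encoder dimension. The class name, the concatenate-and-project fusion strategy, and the `enc_dim` parameter are assumptions made for illustration only.

```python
# Hypothetical sketch: fuse hidden states of a fine-tuned MLM (e.g. BERT)
# into a seq2seq GEC encoder as additional features. Assumes `bert_name`
# points to an MLM already fine-tuned on the GEC corpus.
import torch
import torch.nn as nn
from transformers import BertModel


class MLMFeatureFusion(nn.Module):
    """Concatenate frozen MLM hidden states with the GEC encoder's embeddings."""

    def __init__(self, bert_name: str, enc_dim: int):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        for p in self.bert.parameters():
            p.requires_grad = False  # use the MLM purely as a feature extractor
        self.proj = nn.Linear(self.bert.config.hidden_size + enc_dim, enc_dim)

    def forward(self, input_ids, attention_mask, enc_embeddings):
        # enc_embeddings: (batch, seq_len, enc_dim) from the GEC encoder's
        # embedding layer, aligned to the same tokenization as input_ids.
        with torch.no_grad():
            mlm_states = self.bert(
                input_ids=input_ids, attention_mask=attention_mask
            ).last_hidden_state  # (batch, seq_len, bert_hidden)
        fused = torch.cat([enc_embeddings, mlm_states], dim=-1)
        return self.proj(fused)  # (batch, seq_len, enc_dim), fed to the encoder
```

The resulting features would replace the plain source embeddings at the input of the GEC encoder; the paper's actual fusion mechanism differs and is documented in the repository above.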

Links
DOI
https://doi.org/10.18653/v1/2020.acl-main.391
DBLP
https://dblp.uni-trier.de/rec/conf/acl/KanekoMKSI20
arXiv
http://arxiv.org/abs/arXiv:2005.00987
URL
https://www.aclweb.org/anthology/2020.acl-main.391/
URL
https://dblp.uni-trier.de/conf/acl/2020
URL
https://dblp.uni-trier.de/db/conf/acl/acl2020.html#KanekoMKSI20
IDs
  • DOI : 10.18653/v1/2020.acl-main.391
  • DBLP ID : conf/acl/KanekoMKSI20
  • arXiv ID : arXiv:2005.00987
