Transfer learning of argument mining in student essays
Conference contribution (Article) › Research › Peer reviewed
Publication data
By | Yuning Ding, Julian Lohmann, Nils-Jonathan Schaller, Thorben Jansen, Andrea Horbach
Original language | English
Published in | Ekaterina Kochmar, Marie Bexte, Jill Burstein, Andrea Horbach, Ronja Laarmann-Quante, Anaïs Tack, Victoria Yaneva, Zheng Yuan (Eds.), Proceedings of the 19th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2024)
Pages | 439-449
Publisher | Association for Computational Linguistics
ISBN | 979-8-89176-100-1
DOI/Link | https://aclanthology.org/2024.bea-1.36/
Publication status | Published – 06.2024
This paper explores the transferability of a cross-prompt argument mining model trained on argumentative essays authored by native English-speaking learners (EN-L1) across educational contexts and languages. Specifically, the adaptability of a multilingual transformer model is assessed by applying it to comparable argumentative essays authored by English-as-a-foreign-language learners (EN-L2) for context transfer, and to a dataset of essays written by native German learners (DE) for both language and task transfer. To separate language effects from educational context effects, we also perform experiments on a machine-translated version of the German dataset (DE-MT). Our findings demonstrate that, even under zero-shot conditions, a model trained on essays by native English speakers performs satisfactorily on the EN-L2 and DE datasets. Machine translation does not substantially improve this performance, suggesting that distinct writing styles across educational contexts affect performance more than language differences.
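The abstract does not spell out the implementation, but the setup it describes, fine-tuning a multilingual transformer on EN-L1 essays and evaluating it zero-shot on EN-L2 and DE data, can be illustrated with a minimal sketch. The sketch below assumes a Hugging Face token-classification pipeline with xlm-roberta-base and a BIO label set for argument components; the model choice, label names, and dataset objects (en_l1_train, en_l2_test, de_test) are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch of the transfer setup described above: fine-tune a multilingual
# transformer for argument mining framed as BIO token classification on EN-L1
# essays, then evaluate zero-shot on EN-L2 and DE essays. Model, label set, and
# dataset names are assumptions for illustration only.
from transformers import (AutoTokenizer, AutoModelForTokenClassification,
                          DataCollatorForTokenClassification,
                          TrainingArguments, Trainer)

LABELS = ["O", "B-CLAIM", "I-CLAIM", "B-PREMISE", "I-PREMISE"]  # assumed tag set

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForTokenClassification.from_pretrained(
    "xlm-roberta-base", num_labels=len(LABELS))

def encode(batch):
    """Tokenize pre-split words and align integer BIO tags with subwords."""
    enc = tokenizer(batch["tokens"], is_split_into_words=True, truncation=True)
    enc["labels"] = []
    for i, tags in enumerate(batch["tags"]):      # tags: indices into LABELS
        previous, aligned = None, []
        for wid in enc.word_ids(batch_index=i):
            # Ignore special tokens and subword continuations in the loss.
            aligned.append(-100 if wid is None or wid == previous else tags[wid])
            previous = wid
        enc["labels"].append(aligned)
    return enc

# en_l1_train, en_l2_test, de_test: datasets.Dataset objects with "tokens"
# and "tags" columns (hypothetical names for the three corpora).
# trainer = Trainer(
#     model=model,
#     args=TrainingArguments(output_dir="am-transfer", num_train_epochs=3),
#     train_dataset=en_l1_train.map(encode, batched=True),
#     data_collator=DataCollatorForTokenClassification(tokenizer),
# )
# trainer.train()                                         # train on EN-L1 only
# trainer.evaluate(en_l2_test.map(encode, batched=True))  # zero-shot: context transfer
# trainer.evaluate(de_test.map(encode, batched=True))     # zero-shot: language transfer
```

Evaluating a machine-translated variant of the German test set (DE-MT in the abstract) would reuse the same evaluation call on translated inputs, which is how the paper separates language effects from educational context effects.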