Transfer learning of argument mining in student essays

Conference contribution (Article) · Research · Peer reviewed

Publication data


By: Yuning Ding, Julian Lohmann, Nils-Jonathan Schaller, Thorben Jansen, Andrea Horbach
Original language: English
Published in: Ekaterina Kochmar, Marie Bexte, Jill Burstein, Andrea Horbach, Ronja Laarmann-Quante, Anaïs Tack, Victoria Yaneva, Zheng Yuan (Eds.), Proceedings of the 19th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2024)
Pages: 439-449
Editor (Publisher): Association for Computational Linguistics
ISBN: 979-8-89176-100-1
DOI/Link: https://aclanthology.org/2024.bea-1.36/ (Open Access)
Publication status: Published – 06.2024

This paper explores the transferability of a cross-prompt argument mining model trained on argumentative essays authored by native English-speaking learners (EN-L1) across educational contexts and languages. Specifically, the adaptability of a multilingual transformer model is assessed through its application to comparable argumentative essays authored by English-as-a-foreign-language learners (EN-L2) for context transfer, and a dataset composed of essays written by native German learners (DE) for both language and task transfer. To separate language effects from educational context effects, we also perform experiments on a machine-translated version of the German dataset (DE-MT). Our findings demonstrate that, even under zero-shot conditions, a model trained on native English speakers exhibits satisfactory performance on the EN-L2/DE datasets. Machine translation does not substantially enhance this performance, suggesting that distinct writing styles across educational contexts impact performance more than language differences.
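To illustrate the zero-shot transfer setup described in the abstract, the following is a minimal sketch of applying a multilingual token-classification model, fine-tuned only on English (EN-L1) essays, directly to a German (DE) learner essay. The checkpoint path, label set, and example sentence are illustrative assumptions, not the authors' actual model or data.

```python
# Minimal sketch: zero-shot cross-lingual argument mining with a multilingual
# transformer. Assumes a hypothetical checkpoint fine-tuned on EN-L1 essays
# with span labels for argument components (e.g., Claim, Premise).
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

CHECKPOINT = "path/to/multilingual-argmining-en-l1"  # placeholder checkpoint

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForTokenClassification.from_pretrained(CHECKPOINT)

# Zero-shot transfer: the model saw only English training data,
# but is applied unchanged to a German essay.
argument_tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",  # merge subword tokens into labeled spans
)

german_essay = (
    "Hausaufgaben sollten abgeschafft werden, weil sie den Schülern "
    "kaum Freizeit lassen."
)
for span in argument_tagger(german_essay):
    print(span["entity_group"], round(span["score"], 3), span["word"])
```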