Crosslingual Transfer Learning for Arabic Story Ending Generation

Authors

DOI:

https://doi.org/10.33022/ijcs.v13i2.3831

Keywords:

Arabic, story ending generation, crosslingual transfer learning

Abstract

In the field of natural language processing, the task of story ending generation (SEG) requires not only a deep understanding of the narrative context but also the ability to formulate coherent conclusions. This study explores the use of crosslingual transfer learning to address the scarcity of Arabic data for SEG, proposing the use of extensive English story corpora as a solution. We evaluated the efficacy of multilingual models, namely mBART, mT5, and mT0, in generating Arabic story endings, assessing their performance in both zero-shot and few-shot scenarios. Despite the linguistic complexity of Arabic and the inherent challenges of crosslingual transfer, our findings demonstrate the potential of these multilingual models to transcend linguistic barriers, contributing to natural language processing across languages. This research has significant implications for creative text generation and for improving multilingual natural language processing in resource-limited language contexts.

Author Biography

Arwa Alhussain, King Saud University

She earned her Bachelor's degree in Computer Applications and her Master's degree in Computer Science, both from King Saud University in Riyadh, Saudi Arabia. She is currently pursuing her PhD in Computer Science. Her research focuses on natural language processing, with a particular interest in Arabic, and the application of deep learning techniques.

Published

01-04-2024