Crosslingual Transfer Learning for Arabic Story Ending Generation
DOI: https://doi.org/10.33022/ijcs.v13i2.3831
Keywords: Arabic, story ending generation, crosslingual transfer learning
Abstract
In the field of natural language processing, the task of story ending generation (SEG) requires not only a deep understanding of the narrative context but also the ability to formulate coherent conclusions. This study explores the use of crosslingual transfer learning to address the scarcity of Arabic data for SEG, proposing the use of extensive English story corpora as a solution. We evaluate the efficacy of multilingual models such as mBART, mT5, and mT0 in generating Arabic story endings, assessing their performance in both zero-shot and few-shot scenarios. Despite the linguistic complexities of Arabic and the inherent challenges of crosslingual transfer, our findings demonstrate the potential of these multilingual models to transcend linguistic barriers, contributing to natural language processing across different languages. This research has significant implications for creative text generation and for improving multilingual natural language processing in low-resource language settings.
License
Copyright (c) 2024 Arwa Alhussain, Aqil Azmi
This article is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.