Improving the Performance of Lifelong Language Learning via Skip Connection and Data Augmentation 


Vol. 14, No. 4, pp. 231-238, Apr. 2025
https://doi.org/10.3745/TKIPS.2025.14.4.231


  Abstract

Lifelong language learning refers to a learning method that continuously learns new tasks while maintaining previously acquired knowledge. However, the time delay incurred during training causes continuous resource consumption, which slows the update rate and reduces the adaptability of the model. In this study, we propose SDAL (Skip connection and Data Augmentation beyond LAMOL), a model that applies skip connections and data augmentation to the pseudo-sample generation encoder of LAMOL (Language Modeling for Lifelong Language Learning), a lifelong language learning method. SDAL aims to use training time efficiently and to improve learning quality by reducing pseudo-sample generation time through skip connections and by increasing accuracy through data augmentation. Skip connections transfer information quickly between layers, and data augmentation lets the model learn from more data without requiring additional input. Experimental results show that, compared to LAMOL, SDAL reduced training time by up to 2.3% (2.0% on average) and increased accuracy by up to 5.1% (2.7% on average). Compared to L2KD, SDAL reduced training time by up to 9.1% (6.0% on average) and improved accuracy by up to 3.3% (1.4% on average).
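As a rough illustration of the two mechanisms named above, the PyTorch sketch below adds a residual (skip) connection inside one encoder block and applies a simple token-dropout augmentation to generated pseudo-samples. The class and function names, dimensions, and the specific augmentation scheme are assumptions made for illustration only, not the authors' actual SDAL implementation.

```python
import torch
import torch.nn as nn

class SkipEncoderBlock(nn.Module):
    """One encoder block with residual (skip) connections.

    Hypothetical sketch: layer names and sizes are assumptions,
    not the SDAL architecture described in the paper.
    """
    def __init__(self, hidden_dim: int = 768, num_heads: int = 12):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(hidden_dim, 4 * hidden_dim),
            nn.GELU(),
            nn.Linear(4 * hidden_dim, hidden_dim),
        )
        self.norm1 = nn.LayerNorm(hidden_dim)
        self.norm2 = nn.LayerNorm(hidden_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Skip connection: the block's input is added back to each
        # sub-layer's output, so information (and gradients) can
        # bypass the sub-layer and reach earlier layers directly.
        attn_out, _ = self.attn(x, x, x, need_weights=False)
        x = self.norm1(x + attn_out)      # skip over self-attention
        x = self.norm2(x + self.ffn(x))   # skip over feed-forward
        return x


def augment_tokens(token_ids: list[int], drop_prob: float = 0.1) -> list[int]:
    """Toy data augmentation by random token dropout.

    Hypothetical: the abstract does not specify the augmentation scheme;
    any label-preserving perturbation of the pseudo-samples would serve
    the same purpose of enlarging the training set without new input.
    """
    keep = torch.rand(len(token_ids)) >= drop_prob
    kept = [t for t, k in zip(token_ids, keep.tolist()) if k]
    return kept or token_ids  # never return an empty sequence
```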

  Cite this article

[IEEE Style]

J. Y. Park and R. Ha, "Improving the Performance of Lifelong Language Learning via Skip Connection and Data Augmentation," The Transactions of the Korea Information Processing Society, vol. 14, no. 4, pp. 231-238, 2025. DOI: https://doi.org/10.3745/TKIPS.2025.14.4.231.

[ACM Style]

Ji Yeon Park and Rhan Ha. 2025. Improving the Performance of Lifelong Language Learning via Skip Connection and Data Augmentation. The Transactions of the Korea Information Processing Society, 14, 4, (2025), 231-238. DOI: https://doi.org/10.3745/TKIPS.2025.14.4.231.