
Investigating Syntactic Transfer from English to Korean in Neural L2 Language Models

현대문법연구 (Studies in Modern Grammar), No. 121

This paper investigates how grammatical knowledge acquired in the first language (English) of neural language models (LMs) influences the learning of grammatical structures in their second language (Korean). To this end, we follow a now well-established experimental procedure: (i) pre-training transformer-based GPT-2 LMs on Korean and English datasets, (ii) further fine-tuning them on a specific set of Korean data as L1 or L2, and (iii) evaluating them on the KBLiMP test data while analyzing their linguistic generalization in L1 or L2. We find negative transfer effects when comparing English as L1 with Korean as L2. Furthermore, the trajectory analysis shows that the second-language-learning LM captures linguistic features of Korean, including syntax, the syntax-semantics interface, and morphology, over the course of training. Our study of second language learning in LMs contributes to predicting the syntactic challenges that arise from L1 interference when Korean is learned as a foreign language.
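The KBLiMP evaluation in step (iii) follows the standard BLiMP protocol: for each minimal pair, the LM is counted as correct when it assigns a higher probability to the grammatical sentence than to its ungrammatical counterpart. A minimal sketch of that accuracy computation is below; the `score` lookup and the toy sentences and log-probabilities are hypothetical stand-ins (a real run would sum token log-probabilities from the trained GPT-2 model).

```python
def score(sentence, log_probs):
    """Return the total log-probability assigned to `sentence`.
    Here it is looked up from a precomputed table; a real evaluation
    would sum the LM's token log-probabilities instead."""
    return log_probs[sentence]

def minimal_pair_accuracy(pairs, log_probs):
    """Fraction of minimal pairs where the grammatical sentence
    outscores the ungrammatical one -- the (K)BLiMP metric."""
    correct = sum(
        score(good, log_probs) > score(bad, log_probs)
        for good, bad in pairs
    )
    return correct / len(pairs)

# Toy example with made-up scores (not from the paper).
pairs = [("the cat sleeps", "the cat sleep"),
         ("dogs bark", "dogs barks")]
log_probs = {"the cat sleeps": -12.3, "the cat sleep": -15.1,
             "dogs bark": -8.0, "dogs barks": -9.4}
print(minimal_pair_accuracy(pairs, log_probs))  # 1.0
```

Because the metric only compares the two scores within a pair, it is insensitive to overall sentence length or absolute probability scale, which is what makes it suitable for comparing L1 and L2 models.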

1. Introduction

2. Previous Studies

3. Second Language Acquisition of LMs

4. Experimental Results

5. Discussion

6. Conclusion

References
