영어학연구 (English Language and Linguistics), Vol. 30, No. 1
KCI-indexed scholarly journal

Investigating Syntactic Interference Effects in Neural Language Models for Second Language Acquisition

DOI: 10.17960/ell.2024.30.1.004

This paper explores the intricate dynamics of cross-linguistic transfer in second language acquisition (SLA) research, investigating how the linguistic structure of a native language influences the acquisition of a second language. The study adopts transfer learning as its methodology, using the Transformer-based language model BabyRoberta. The model undergoes pre-training on Korean as the first language (L1) and subsequent fine-tuning on English as the second language (L2). Evaluation employs the BLiMP test suite, a benchmark for assessing the syntactic abilities of neural language models. The primary focus is on unraveling transfer effects originating from the native language's linguistic structure. By examining the model's performance on syntactic tasks in English, the research aims to provide insights into how neural language models encode abstract syntactic structures, incorporating inductive biases acquired during pre-training on Korean. This approach not only contributes to the understanding of cross-linguistic influences in SLA but also sheds light on the interplay between native and second languages within neural language models.
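
The sketch below illustrates the kind of pipeline the abstract describes: pre-train a small RoBERTa-style masked language model on a Korean (L1) corpus, continue training on an English (L2) corpus, then score BLiMP minimal pairs by pseudo-log-likelihood. It is a minimal sketch, not the paper's reported implementation: the corpus files (korean_l1.txt, english_l2.txt), the roberta-base tokenizer, all hyperparameters, and the single BLiMP paradigm shown are illustrative assumptions.

# Minimal sketch of an L1 -> L2 transfer pipeline with HuggingFace
# transformers/datasets. Corpora, tokenizer, and hyperparameters are
# assumptions for illustration, not the paper's actual setup.
import torch
from datasets import load_dataset
from transformers import (
    DataCollatorForLanguageModeling, RobertaConfig, RobertaForMaskedLM,
    RobertaTokenizerFast, Trainer, TrainingArguments,
)

# Placeholder tokenizer: byte-level BPE covers Korean text, though a real
# study would plausibly train its own vocabulary.
tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")

# Small RoBERTa configuration in the spirit of BabyBERTa-scale models.
model = RobertaForMaskedLM(RobertaConfig(
    vocab_size=tokenizer.vocab_size, hidden_size=256, num_hidden_layers=8,
    num_attention_heads=8, intermediate_size=1024,
))

def mlm_train(text_file, output_dir):
    """Masked-language-model training on one raw-text corpus."""
    ds = load_dataset("text", data_files=text_file)["train"]
    ds = ds.map(lambda b: tokenizer(b["text"], truncation=True, max_length=128),
                batched=True, remove_columns=["text"])
    Trainer(
        model=model,
        args=TrainingArguments(output_dir=output_dir, num_train_epochs=1,
                               per_device_train_batch_size=32),
        train_dataset=ds,
        data_collator=DataCollatorForLanguageModeling(tokenizer,
                                                      mlm_probability=0.15),
    ).train()

# Stage 1: pre-train on Korean (L1); Stage 2: fine-tune on English (L2).
mlm_train("korean_l1.txt", "ckpt_l1")    # hypothetical L1 corpus
mlm_train("english_l2.txt", "ckpt_l2")   # hypothetical L2 corpus

@torch.no_grad()
def pseudo_log_likelihood(sentence):
    """Sum log-probs of each token with that token masked (Salazar et al. 2020)."""
    ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    total = 0.0
    for i in range(1, len(ids) - 1):               # skip <s> and </s>
        masked = ids.clone()
        masked[i] = tokenizer.mask_token_id
        logits = model(masked.unsqueeze(0)).logits[0, i]
        total += torch.log_softmax(logits, dim=-1)[ids[i]].item()
    return total

# BLiMP scoring: the model is "correct" when it assigns the grammatical
# sentence a higher pseudo-log-likelihood than its ungrammatical twin.
model.eval()
blimp = load_dataset("blimp", "anaphor_gender_agreement")["train"]
correct = sum(pseudo_log_likelihood(ex["sentence_good"]) >
              pseudo_log_likelihood(ex["sentence_bad"]) for ex in blimp)
print(f"BLiMP accuracy: {correct / len(blimp):.3f}")

Averaging this accuracy over all BLiMP paradigms, and comparing a Korean-pre-trained model against an English-only baseline, is one straightforward way to surface L1-induced syntactic interference.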

1. Introduction

2. Previous Literature

3. Experiments

4. Results and Discussion

5. Concluding Remarks

References
