KCI-indexed Academic Journal

What Can BERT Do for English Linguistics?: A Case Study of Examining It-cleft Constructions

DOI: 10.17960/ell.2022.28.1.007

This study used the machine learning technique known as Bidirectional Encoder Representations from Transformers (BERT) to understand the pronominal distribution in it-cleft constructions. While language models (LMs) have been used successfully to examine and verify linguistic representations, most studies have focused on the general phenomenon of subject-verb agreement. This study therefore focused on the particular construction known as the it-cleft, in order to consolidate the LM’s role as a linguistic platform. In an experiment using BERT, the surprisal of each case form (i.e., nominative and accusative) was compared. The investigation successfully replicated the findings of previous studies, specifically by capturing the use of the accusative form, the inhibition of the first-person pronominal, and the collocation of the first-person pronominal with the complementizer who. The results from the LM also make a further contribution to the study of the it-cleft, involving number as well as the relationship between the pronominal and its syntactic role. Thus, the use of BERT revealed new research questions while verifying evidence from previous studies on the it-cleft pronominal. This demonstrates that BERT can correctly represent particular constructions and broadens the possibilities of LMs as a linguistic platform.
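The abstract describes comparing the surprisal BERT assigns to nominative versus accusative pronouns in the clefted-constituent slot. As a rough illustration of this kind of measurement (not the authors' actual materials or code), the sketch below queries a masked language model for the probability of candidate pronouns in a hypothetical it-cleft frame; the frame sentence, the bert-base-uncased checkpoint, and the function name are illustrative assumptions.

```python
# A minimal sketch of masked-LM surprisal for pronominal case in an
# it-cleft frame, assuming the Hugging Face transformers library and
# the bert-base-uncased checkpoint (an uncased model, so pronouns are
# lowercased). Not the paper's actual stimuli or pipeline.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def surprisal(frame: str, pronoun: str) -> float:
    """Return the surprisal (-log2 p) of `pronoun` at the [MASK] slot."""
    inputs = tokenizer(frame, return_tensors="pt")
    # Locate the position of the [MASK] token in the input sequence.
    mask_idx = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    with torch.no_grad():
        logits = model(**inputs).logits
    log_probs = torch.log_softmax(logits[0, mask_idx], dim=-1)
    pron_id = tokenizer.convert_tokens_to_ids(pronoun)
    # Convert from natural log to bits.
    return -(log_probs[0, pron_id] / torch.log(torch.tensor(2.0))).item()

# Compare nominative vs. accusative forms in the clefted slot
# of a hypothetical it-cleft frame.
frame = "It was [MASK] who called yesterday."
for pronoun in ["i", "me", "he", "him"]:
    print(pronoun, round(surprisal(frame, pronoun), 2))
```

Under this setup, a lower surprisal for "me" than for "i" in the clefted slot would mirror the accusative preference the abstract reports, though the paper's actual frames and measures may differ.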

1. Introduction

2. Background

3. Methods

4. Results

5. Discussion

6. Conclusion

References
