
Fine-Tuning NLLB-200 (600M) for Korean-French Subtitle Translation in Film/Drama: Performance and Reproducible BF16 Training on H100

통번역교육연구, Vol. 23, No. 2

The rapid growth of streaming platforms has increased the demand for high-quality subtitle translation, underscoring the need for domain adaptation in machine translation. This study fine-tuned the NLLB-200-distilled-600M model for Korean-French subtitle translation using BF16 mixed-precision training on NVIDIA H100 GPUs, with FLORES-200 language tags and forced BOS tokens for direction control. Trained on AI-Hub film and drama data with strict non-overlapping splits, the model achieved BLEU 84.33 and chrF 95.39 on 2,039 test sentences, with notable variation by sentence length and genre. Qualitative review highlighted errors in discourse markers, cultural references, and colloquial forms. The study contributes a practical fine-tuning recipe, an efficient H100/BF16 training setup, reproducibility protocols, and pedagogical insights, demonstrating that subtitle translation requires both technical accuracy and cultural equivalence.
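The direction-control mechanism the abstract describes (FLORES-200 language tags on the source side, a forced BOS token on the target side) can be sketched as follows. This is a minimal illustration, not the authors' released code; the model name follows the Hugging Face Hub convention, and the helper function `nllb_generation_kwargs` is a hypothetical name introduced here for illustration.

```python
def nllb_generation_kwargs(tokenizer, tgt_lang: str, max_new_tokens: int = 128) -> dict:
    """Build generate() kwargs that force decoding into `tgt_lang`.

    For NLLB-200, the source language is set when the tokenizer is created
    (e.g. src_lang="kor_Hang"), while the target language is controlled by
    forcing its FLORES-200 tag (e.g. "fra_Latn") as the first decoded token.
    """
    bos_id = tokenizer.convert_tokens_to_ids(tgt_lang)
    return {"forced_bos_token_id": bos_id, "max_new_tokens": max_new_tokens}


# Assumed usage with the Hugging Face transformers library (requires a model
# download, so it is shown here as a comment rather than executed):
#
#   from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
#   import torch
#   tok = AutoTokenizer.from_pretrained(
#       "facebook/nllb-200-distilled-600M", src_lang="kor_Hang")
#   model = AutoModelForSeq2SeqLM.from_pretrained(
#       "facebook/nllb-200-distilled-600M", torch_dtype=torch.bfloat16)
#   batch = tok(["안녕하세요."], return_tensors="pt")
#   out = model.generate(**batch, **nllb_generation_kwargs(tok, "fra_Latn"))
#   print(tok.batch_decode(out, skip_special_tokens=True))
```

The same `forced_bos_token_id` mechanism is what allows a single fine-tuned checkpoint to serve both Korean→French and French→Korean directions, since the decoder is steered per request rather than per model.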

Ⅰ. Introduction

Ⅱ. Research Background

Ⅲ. Methods & Experimental Design

Ⅳ. Results and Discussion

Ⅴ. Conclusion and Discussion

References

