Comparing pre-trained and fine-tuned transformer-based models for sentiment analysis in Turkish comments in student surveys

Date

2025-08-15

Journal Title

Journal ISSN

Volume Title

Publisher

Institute of Electrical and Electronics Engineers Inc.

Access Rights

info:eu-repo/semantics/closedAccess

Research Projects

Organizational Units

Journal Issue

Abstract

Student surveys are essential for evaluating teaching quality and course content, but analyzing open-ended responses is challenging due to their unstructured and multilingual nature. This study applies sentiment analysis to Turkish educational survey responses using three transformer-based models: SAVASY, DBMDZ BERT Base Turkish Cased, and XLM-RoBERTa Base. A labeled dataset of real-world student comments was used, with sentiment labels assigned using the Gemini AI tool to facilitate model fine-tuning. Evaluation metrics included accuracy, F1-score, precision, recall, and confidence scores. Results show that fine-tuning improves sentiment classification, effectively identifying positive, negative, and neutral sentiments. This highlights the value of transformer models in analyzing Turkish student feedback.
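The evaluation metrics named in the abstract (accuracy, precision, recall, and F1-score) can be sketched in plain Python for the three-class sentiment setting. This is a minimal illustration, not the paper's evaluation code; the class names and toy labels are assumptions made for the example:

```python
def per_class_metrics(y_true, y_pred, label):
    """Precision, recall, and F1 for one sentiment class (one-vs-rest)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == label and p == label)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1


def evaluate(y_true, y_pred):
    """Accuracy plus macro-averaged precision, recall, and F1 over all classes."""
    labels = sorted(set(y_true))
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    scores = [per_class_metrics(y_true, y_pred, lab) for lab in labels]
    # Macro averaging: unweighted mean over the sentiment classes.
    macro = [sum(col) / len(labels) for col in zip(*scores)]
    return (accuracy, *macro)


# Toy run with the paper's three sentiment classes (labels are illustrative):
gold = ["positive", "positive", "negative", "neutral"]
pred = ["positive", "negative", "negative", "neutral"]
acc, macro_p, macro_r, macro_f1 = evaluate(gold, pred)
```

Macro averaging weights the three sentiment classes equally, which matters when neutral or negative comments are rare in survey data.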

Description

Keywords

Sentiment analysis, Transformer models, Fine-tuning, Pre-training, Student surveys, Curricula, Education computing, Feedback, Information systems, Integrated circuits, Labeled data, Students, Teaching, Course contents, Open-ended responses, Quality content, Teaching quality, Transformer modeling, Turkish

Source

2025 33rd Signal Processing and Communications Applications Conference (SIU)

WoS Q Value

N/A

Scopus Q Value

N/A

Volume

Issue

Citation