Enriching Transformer-Based Embeddings for Emotion Identification in an Agglutinative Language: Turkish


Date

2023

Authors

Aka Uymaz, Hande
Kumova Metin, Senem

Publisher

IEEE Computer Society

Green Open Access

No

Publicly Funded

No

Impulse

Average

Influence

Average

Popularity

Average

Abstract

Text-based emotion detection is an important and expanding research area, driven by the increasing accessibility of written data on the Internet and social media. Vector space models, both semantic and contextual, are widely used throughout natural language processing. To improve performance in emotion/sentiment detection, a new line of research has emerged that adds extra emotion information (emotion enrichment) to these models. Because the expression of emotion depends on multiple language-dependent factors, the success of enrichment may vary from language to language. In this study, we applied two emotion-enrichment methods to emerging transformer-based models [bidirectional encoder representations from transformers (BERT), a robustly optimized BERT pretraining approach (RoBERTa), a distilled version of BERT (DistilBERT), and efficiently learning an encoder that classifies token replacements accurately (ELECTRA)] and, as a baseline, to a traditional semantic model (Word2Vec), on a Turkish dataset; Turkish is a highly agglutinative language. Performance was analyzed with classification models and cosine-similarity metrics.
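As a rough illustration of the emotion-enrichment idea described above (not the paper's actual method), the sketch below concatenates hypothetical emotion-lexicon scores onto toy word vectors and compares cosine similarities before and after. The example Turkish words, the vector dimensions, and all numeric values are illustrative assumptions.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def enrich(embedding, emotion_scores):
    """Append emotion-lexicon scores to a base embedding (simple concatenation)."""
    return embedding + emotion_scores

# Toy 4-dim "embeddings": distributionally similar words start out close,
# even when they carry opposite emotions (a known weakness of plain models).
emb = {
    "mutlu":  [0.20, 0.80, 0.10, 0.40],  # "happy"
    "neşeli": [0.30, 0.70, 0.20, 0.50],  # "cheerful"
    "kızgın": [0.25, 0.75, 0.15, 0.45],  # "angry"
}
# Hypothetical lexicon scores over (joy, anger, fear) — illustrative only.
emo = {
    "mutlu":  [0.9, 0.0, 0.1],
    "neşeli": [0.8, 0.1, 0.1],
    "kızgın": [0.1, 0.9, 0.2],
}

plain = cosine(emb["mutlu"], emb["kızgın"])
enriched = cosine(enrich(emb["mutlu"], emo["mutlu"]),
                  enrich(emb["kızgın"], emo["kızgın"]))
# Enrichment pushes the emotionally opposite pair apart,
# while same-emotion pairs remain close.
print(f"mutlu~kızgın plain:    {plain:.3f}")
print(f"mutlu~kızgın enriched: {enriched:.3f}")
```

The same before/after comparison, applied to real transformer embeddings and lexicon scores, is the kind of cosine-similarity analysis the abstract refers to.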

Keywords

Measurement, Emotion recognition, Social networking (online), Semantics, Bidirectional control, Transformers, Encoding

WoS Q

Q2

Scopus Q

Q2

Source

IT Professional

Volume

25

Issue

4

Start Page

67

End Page

73

PlumX Metrics

Citations

Scopus: 0

Captures

Mendeley Readers: 9

OpenAlex FWCI

0.1746
