Fine-grained irony classification through a transfer learning approach
DOI:
https://doi.org/10.11591/csit.v4i1.pp43-49
Keywords:
Bidirectional long short-term memory, Encoder, Irony, Natural language processing, Sentiment analysis, Transformer
Abstract
Irony is now pervasive in social media discussion forums and chats, posing additional obstacles to sentiment analysis. The aim of the present work is to detect irony and its types in English tweets. We propose a new system for irony detection in English tweets built on the distilled bidirectional encoder representations from transformers (DistilBERT) light transformer model, which is based on the bidirectional encoder representations from transformers (BERT) architecture and is further strengthened by a bidirectional long short-term memory (Bi-LSTM) network; this configuration minimizes data preprocessing tasks. The proposed model was evaluated on SemEval-2018 Task 3, for which 3,834 samples were provided. Experimental results show that the proposed system achieves a precision of 81% for the not-irony class and 66% for the irony class, a recall of 77% for not-irony and 72% for irony, and an F1 score of 79% for not-irony and 69% for irony. Whereas prior researchers have proposed binary classification models, in this study we extend our work to multiclass classification of irony. This is significant and will serve as a foundation for future research on different types of irony in tweets.
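The abstract describes an architecture in which a DistilBERT encoder feeds a Bi-LSTM classifier. The sketch below is a minimal illustration of how such a model could be assembled in PyTorch with the Hugging Face transformers library; it is not the authors' released code, and the checkpoint name, hidden size, and four-class output (matching the SemEval-2018 Task 3 subtask B label set) are assumptions.

```python
# Minimal sketch (not the paper's official implementation) of a DistilBERT + Bi-LSTM
# irony classifier. Layer sizes and the number of classes are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import DistilBertModel, DistilBertTokenizerFast

class DistilBertBiLSTMClassifier(nn.Module):
    def __init__(self, num_classes=4, lstm_hidden=128):
        super().__init__()
        # Pretrained DistilBERT encoder produces contextual token embeddings
        self.encoder = DistilBertModel.from_pretrained("distilbert-base-uncased")
        # Bi-LSTM reads the token embeddings in both directions
        self.bilstm = nn.LSTM(
            input_size=self.encoder.config.hidden_size,
            hidden_size=lstm_hidden,
            batch_first=True,
            bidirectional=True,
        )
        # Linear head maps the concatenated final hidden states to class logits
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        _, (h_n, _) = self.bilstm(hidden)
        # Concatenate the forward and backward final hidden states
        pooled = torch.cat((h_n[-2], h_n[-1]), dim=-1)
        return self.classifier(pooled)

tokenizer = DistilBertTokenizerFast.from_pretrained("distilbert-base-uncased")
model = DistilBertBiLSTMClassifier()
batch = tokenizer(["Oh great, another Monday #irony"],
                  padding=True, truncation=True, return_tensors="pt")
logits = model(batch["input_ids"], batch["attention_mask"])  # shape: (1, num_classes)
```

In this arrangement the transformer supplies contextual representations with little manual preprocessing, while the Bi-LSTM aggregates the token sequence before classification, which is consistent with the configuration the abstract describes.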
License
Copyright (c) 2023 Institute of Advanced Engineering and Science

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.