Aim:
The aim of this study is to enhance emotion analysis in the cryptocurrency market using a GPT-2 model, providing a more nuanced understanding of public perceptions of cryptocurrency transactions. The proposed system leverages GPT-2 to classify cryptocurrency-related tweets as expressing positive, negative, or neutral emotion.
Abstract:
The cryptocurrency market has witnessed unprecedented growth, prompting concerns and discussions about its widespread use. In this study, we focus on emotion analysis of cryptocurrency-related tweets to gain insight into public perceptions. The proposed GPT-2-enhanced system eliminates the need for LSTM and GRU models, relying solely on the advanced language-modelling capabilities of GPT-2. Various feature extraction methods, including term frequency-inverse document frequency (TF-IDF), word2vec, and bag of words, are explored. Emotion analysis is conducted using the EmoLex lexicon.
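The feature extraction methods named above can be illustrated with a small, self-contained sketch. The code below implements bag of words and TF-IDF from scratch on toy tweets (the tweet strings and function names are illustrative, not from the study; a real pipeline would more likely use a library such as scikit-learn):

```python
import math
from collections import Counter

def bag_of_words(docs):
    """Bag of words: map each document to raw term counts over a shared vocabulary."""
    vocab = sorted({w for d in docs for w in d.split()})
    index = {w: i for i, w in enumerate(vocab)}
    vectors = []
    for d in docs:
        v = [0] * len(vocab)
        for w, c in Counter(d.split()).items():
            v[index[w]] = c
        vectors.append(v)
    return vocab, vectors

def tfidf(docs):
    """TF-IDF: weight each raw term count by log(N / document frequency)."""
    vocab, counts = bag_of_words(docs)
    n = len(docs)
    df = [sum(1 for v in counts if v[i] > 0) for i in range(len(vocab))]
    idf = [math.log(n / d) for d in df]
    return vocab, [[tf * w for tf, w in zip(v, idf)] for v in counts]

tweets = ["bitcoin up today", "bitcoin down today", "ethereum up"]
vocab, X = tfidf(tweets)  # one TF-IDF vector per tweet
```

Terms that occur in fewer tweets receive larger IDF weights, so rare, distinctive words dominate the resulting vectors.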
Existing Method:
The existing method involves sentiment and emotion analysis using LSTM-GRU models. However, the proposed system eliminates the need for recurrent neural networks, relying solely on the GPT-2 language model for a more streamlined and effective analysis of cryptocurrency-related emotions.
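For context, the LSTM-GRU baseline being replaced can be sketched as follows. This is a minimal PyTorch sketch under assumed hyper-parameters; the layer sizes and the stacked LSTM-then-GRU arrangement are illustrative, not taken from the existing method's actual architecture:

```python
import torch
import torch.nn as nn

class LstmGruClassifier(nn.Module):
    """Illustrative LSTM-GRU emotion classifier (all hyper-parameters assumed)."""
    def __init__(self, vocab_size=5000, embed_dim=64, hidden_dim=32, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.gru = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)  # positive / negative / neutral

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq_len, embed_dim)
        x, _ = self.lstm(x)         # (batch, seq_len, hidden_dim)
        _, h = self.gru(x)          # final hidden state: (1, batch, hidden_dim)
        return self.fc(h[-1])       # (batch, num_classes) logits

model = LstmGruClassifier()
logits = model(torch.randint(0, 5000, (4, 20)))  # 4 tweets of 20 token ids each
```

The recurrent layers process tokens sequentially, which is exactly the machinery the proposed system discards in favour of GPT-2.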
Proposed Method:
The proposed method introduces a GPT-2-enhanced system for emotion analysis in the cryptocurrency market, using the advanced language-modelling capabilities of GPT-2 to predict the positive, negative, or neutral emotion associated with cryptocurrency-related tweets. Eliminating the LSTM and GRU models streamlines the architecture for improved efficiency, and GPT-2 provides a more nuanced, contextually aware analysis of emotions in cryptocurrency discussions. Two limitations should be noted. First, the complexity of GPT-2 makes its decision-making process challenging to interpret, potentially limiting the model's transparency. Second, the system depends on pre-trained models: its performance relies on the quality and relevance of the pre-trained GPT-2 model.
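The text does not specify how GPT-2 is adapted for three-way emotion prediction; a common route is to attach a classification head, as in Hugging Face's `GPT2ForSequenceClassification`. The sketch below uses a downsized, randomly initialised configuration so it runs without downloading weights; a real system would load the pre-trained `"gpt2"` checkpoint and fine-tune it on labelled tweets:

```python
import torch
from transformers import GPT2Config, GPT2ForSequenceClassification

# Downsized, randomly initialised GPT-2 so the sketch runs without downloads.
# In practice: GPT2ForSequenceClassification.from_pretrained("gpt2", num_labels=3)
config = GPT2Config(n_layer=2, n_head=2, n_embd=64, vocab_size=1000,
                    num_labels=3, pad_token_id=0)
model = GPT2ForSequenceClassification(config)
model.eval()

token_ids = torch.randint(1, 1000, (2, 12))  # 2 tweets, 12 assumed token ids each
with torch.no_grad():
    logits = model(input_ids=token_ids).logits  # (2, 3) scores over the classes
pred = logits.argmax(dim=-1)  # 0 / 1 / 2 -> e.g. negative / neutral / positive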