BoW-based neural networks vs. cutting-edge models for single-label text classification

Document Type

Article

Source of Publication

Neural Computing and Applications

Publication Date

1-1-2023

Abstract

To reliably and accurately classify complicated "big" datasets, machine learning models must be continually improved. Although graph neural networks (GNNs) have reignited interest in graph-based text classification models, this research proposes straightforward yet competitive neural networks for text classification. Our proposed models are convolutional neural networks (CNN), artificial neural networks (ANN), and their fine-tuned variants, denoted FT-CNN and FT-ANN. We demonstrate that these simple models (CNN, ANN, FT-CNN, and FT-ANN) can outperform more complex GNN-based models such as SGC, SSGC, and TextGCN, and are comparable to others such as HyperGAT and BERT. Fine-tuning is also highly recommended because it improves both the performance and the reliability of the models. The performance of the proposed models is illustrated on five benchmark datasets: Reuters (R8), R52, 20NewsGroup, Ohsumed, and MR. According to the experimental findings, these models, especially the fine-tuned ones, outperform state-of-the-art (SOTA) approaches, including GNN-based models, on the majority of the target datasets.
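
For illustration, the following is a minimal sketch of a BoW-based feedforward text classifier in Python using scikit-learn, in the spirit of the ANN models described in the abstract. The vectorizer settings, hidden layer size, and training parameters are hypothetical choices for demonstration only and do not reproduce the paper's CNN/ANN architectures or its fine-tuning procedure.

# Minimal sketch of a BoW-based neural text classifier.
# Assumption: the architecture and hyperparameters below are
# illustrative, not the paper's FT-ANN/FT-CNN configurations.
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Load one of the benchmark datasets named in the abstract.
train = fetch_20newsgroups(subset="train", remove=("headers", "footers", "quotes"))
test = fetch_20newsgroups(subset="test", remove=("headers", "footers", "quotes"))

# Bag-of-words (TF-IDF) document representation.
vectorizer = TfidfVectorizer(max_features=20000, stop_words="english")
X_train = vectorizer.fit_transform(train.data)
X_test = vectorizer.transform(test.data)

# A small feedforward network (ANN) over the BoW features.
clf = MLPClassifier(hidden_layer_sizes=(256,), max_iter=20, early_stopping=True)
clf.fit(X_train, train.target)

print("test accuracy:", accuracy_score(test.target, clf.predict(X_test)))

Running this end to end gives a single-label accuracy figure on 20NewsGroup, the same evaluation setting used for the comparisons in the abstract.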

ISSN

0941-0643

Publisher

Springer Science and Business Media LLC

Disciplines

Computer Sciences

Keywords

Data mining, Machine learning, Neural networks, Text classification

Scopus ID

85165126995

Indexed in Scopus

yes

Open Access

no
