One-class text classification with multi-modal deep support vector data description

Citations: 6
Mendeley readers: 68

Abstract

This work presents multi-modal deep SVDD (mSVDD) for one-class text classification. By extending the uni-modal SVDD to a multi-modal one, mSVDD describes the target one-class data with multiple hyperspheres, which enables a much better description than a single hypersphere. Additionally, the end-to-end architecture of mSVDD jointly handles neural feature learning and one-class text learning. We also introduce a mechanism for incorporating negative supervision in the absence of real negative data, which further benefits the mSVDD model. We conduct experiments on the Reuters and 20 Newsgroups datasets, and the results demonstrate that mSVDD outperforms uni-modal SVDD and achieves further improvements when negative supervision is incorporated.
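For concreteness, the sketch below shows one way a multi-hypersphere deep SVDD objective of this kind could be set up in PyTorch: text features are pulled toward the nearest of K learnable centers, and an optional hinge term pushes (pseudo-)negative samples away from all centers. The encoder, loss form, hyperparameters, and names (TextEncoder, msvdd_loss, K, margin) are assumptions for illustration, not the paper's actual architecture or objective.

```python
# Minimal sketch of a multi-hypersphere (multi-modal) deep SVDD objective.
# All components here are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn


class TextEncoder(nn.Module):
    """Toy encoder: mean-pooled word embeddings projected to a feature space."""

    def __init__(self, vocab_size: int, embed_dim: int = 100, feat_dim: int = 32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.proj = nn.Linear(embed_dim, feat_dim)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) of word ids, 0 = padding
        mask = (token_ids != 0).float().unsqueeze(-1)
        pooled = (self.embed(token_ids) * mask).sum(1) / mask.sum(1).clamp(min=1.0)
        return self.proj(pooled)                        # (batch, feat_dim)


def msvdd_loss(feats, centers, neg_feats=None, margin=1.0):
    """One-class loss over K hypersphere centers (sketch).

    Target documents are pulled toward their nearest center; optional
    (pseudo-)negative documents are pushed at least `margin` away from
    every center via a hinge term.
    """
    d2 = torch.cdist(feats, centers) ** 2               # (batch, K) squared distances
    loss = d2.min(dim=1).values.mean()                  # attract targets to nearest sphere
    if neg_feats is not None:
        neg_d2 = torch.cdist(neg_feats, centers) ** 2
        loss = loss + torch.relu(margin - neg_d2.min(dim=1).values).mean()
    return loss


# Toy training step: K = 4 hyperspheres over a 32-dimensional feature space.
encoder = TextEncoder(vocab_size=10_000)
centers = nn.Parameter(torch.randn(4, 32))
opt = torch.optim.Adam(list(encoder.parameters()) + [centers], lr=1e-3)

target_batch = torch.randint(1, 10_000, (8, 20))        # 8 target documents
opt.zero_grad()
loss = msvdd_loss(encoder(target_batch), centers)
loss.backward()
opt.step()
```

At test time, one plausible scoring rule under this sketch is to treat the squared distance to the nearest center as an anomaly score and label a document as non-target when that score exceeds a threshold.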

Cite (APA)

Hu, C., Feng, Y., Kamigaito, H., Takamura, H., & Okumura, M. (2021). One-class text classification with multi-modal deep support vector data description. In EACL 2021 - 16th Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Conference (pp. 3378–3390). Association for Computational Linguistics (ACL). https://doi.org/10.5715/jnlp.28.1053

Readers over time

[Chart: Mendeley readers per year, '21–'25]

Readers' Seniority

PhD / Post grad / Masters / Doc: 14 (58%)
Researcher: 7 (29%)
Lecturer / Post doc: 2 (8%)
Professor / Associate Prof.: 1 (4%)

Readers' Discipline

Computer Science: 21 (78%)
Linguistics: 4 (15%)
Neuroscience: 1 (4%)
Social Sciences: 1 (4%)
