Exploring ChatGPT for Toxicity Detection in GitHub

Abstract

Fostering a collaborative and inclusive environment is crucial for the sustained progress of open source development. However, the prevalence of negative discourse, often manifested as toxic comments, poses significant challenges to developer well-being and productivity. To identify such negativity in project communications, especially within large projects, automated toxicity detection models are necessary. Training these models effectively requires large software engineering-specific toxicity datasets, which are limited in availability and often highly imbalanced (e.g., only 6 in 1000 GitHub issues are toxic) [1]. To address this problem, we explore a zero-shot LLM (ChatGPT) that is pre-trained on massive datasets but not fine-tuned specifically for the task of detecting toxicity in software-related text. Our preliminary evaluation indicates that ChatGPT shows promise in detecting toxicity in GitHub and warrants further investigation. We experimented with various prompts, including those designed to justify the model's outputs, thereby enhancing interpretability and paving the way for potential integration of ChatGPT-enabled toxicity detection into developer communication channels.
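
To make the zero-shot setup concrete, the sketch below shows one way a GitHub comment could be sent to ChatGPT for a toxicity label plus a short justification, using OpenAI's Python chat completions API. The prompt wording, model name, and classify_toxicity helper are illustrative assumptions, not the authors' exact prompts or code.

    # Hypothetical sketch of zero-shot toxicity classification with a
    # justification prompt; prompt text and model choice are assumptions
    # for illustration, not the paper's exact setup.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def classify_toxicity(comment: str) -> str:
        """Ask the model for a toxic/non-toxic label and a one-sentence justification."""
        prompt = (
            "You are analyzing developer communication on GitHub.\n"
            "Label the following comment as 'toxic' or 'non-toxic', then give a "
            "one-sentence justification for the label.\n\n"
            f"Comment: {comment}"
        )
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": prompt}],
            temperature=0,  # deterministic output for evaluation
        )
        return response.choices[0].message.content

    print(classify_toxicity("This patch is garbage, did you even test it?"))

Asking for a justification alongside the label is what makes the model's output more interpretable and easier to surface in a developer communication channel.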

References

Are bullies more productive? Empirical study of affectiveness vs. issue fixing time

ChatGPT and Software Testing Education: Promises & Perils

Is ChatGPT better than Human Annotators? Potential and Limitations of ChatGPT in Explaining Implicit Hate Speech


Citation (APA)

Mishra, S., & Chatterjee, P. (2024). Exploring ChatGPT for Toxicity Detection in GitHub. In Proceedings - International Conference on Software Engineering (pp. 6–10). IEEE Computer Society. https://doi.org/10.1145/3639476.3639777
