Assessment of Off-the-Shelf SE-specific Sentiment Analysis Tools: An Extended Replication Study

Nicole Novielli, Fabio Calefato, Filippo Lanubile, Alexander Serebrenik

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Sentiment analysis methods have become popular for investigating human communication, including discussions related to software projects. Since general-purpose sentiment analysis tools do not fit well with the information exchanged by software developers, new tools, specific for software engineering (SE), have been developed. We investigate to what extent off-the-shelf SE-specific tools for sentiment analysis mitigate the threats to conclusion validity of empirical studies in software engineering, highlighted by previous research. First, we replicate two studies addressing the role of sentiment in security discussions on GitHub and in question-writing on Stack Overflow. Then, we extend the previous studies by assessing to what extent the tools agree with each other and with the manual annotation on a gold standard of 600 documents. We find that different SE-specific sentiment analysis tools might lead to contradictory results at a fine-grain level, when used off-the-shelf. Conversely, platform-specific tuning or retraining might be needed to take into account differences in platform conventions, jargon, or document lengths.
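The agreement analysis described above can be illustrated with a chance-corrected agreement measure such as Cohen's kappa. The sketch below is a minimal, self-contained illustration; the tool names and labels are hypothetical and not taken from the study, which may have used a different agreement statistic (e.g., weighted kappa).

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two sets of categorical labels."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed proportion of exact agreements
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement by chance, from each rater's label frequencies
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical polarity labels from two sentiment tools on six documents
tool_x = ["pos", "neg", "neu", "neu", "pos", "neg"]
tool_y = ["pos", "neu", "neu", "neu", "pos", "pos"]
print(round(cohens_kappa(tool_x, tool_y), 2))  # → 0.5
```

A kappa of 1.0 means perfect agreement, 0 means agreement no better than chance; moderate values between tools would be consistent with the contradictory fine-grain results the study reports.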
Original language: English
Article number: 77
Journal: Empirical Software Engineering
Volume: 26
Issue number: 4
Publication status: Published - Jun 2021

