Comparison of systematically derived software metrics thresholds for object-oriented programming languages

Bibliographic Details
Published in: Computer Science and Information Systems, Vol. 17, No. 1, pp. 181-203
Main authors: Beranic, Tina; Hericko, Marjan
Format: Journal Article
Language: English
Published: 2020
ISSN: 1820-0214, 2406-1018
Online access: Full text
Description
Summary: Without reliable software metric threshold values, efficient software quality evaluation cannot be performed. Deriving reliable thresholds requires addressing several challenges that impact the final result. For instance, software metric implementations vary between software metric tools, as do the threshold values that result from different derivation approaches. The programming language is another important aspect. In this paper, we present the results of an empirical study aimed at comparing systematically obtained threshold values for nine software metrics in four object-oriented programming languages (i.e., Java, C++, C#, and Python). We addressed challenges in the threshold derivation domain through adjustments to the benchmark-based threshold derivation approach. The data set was selected in a uniform way, allowing the derivation to be repeated, while input values were collected using a single software metric tool, enabling the comparison of derived thresholds among the chosen object-oriented programming languages. The comparison within the performed empirical study reveals that threshold values differ between programming languages.
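
For orientation only, a common way to picture benchmark-based threshold derivation is to collect a metric's values from a benchmark of systems and read thresholds off percentile cut points of that distribution. The Python sketch below is not the authors' adjusted procedure from the paper; the WMC values, the nearest-rank percentile, and the 70/80/90% levels are assumptions chosen purely for illustration.

    # Illustrative sketch of percentile-based, benchmark-derived thresholds.
    # Metric values and percentile levels are hypothetical examples.

    def percentile(sorted_values, p):
        """Nearest-rank percentile of an ascending list, 0 < p <= 1."""
        k = max(0, round(p * len(sorted_values)) - 1)
        return sorted_values[k]

    def derive_thresholds(benchmark_values, levels=(0.70, 0.80, 0.90)):
        """Percentile-based thresholds for one metric from benchmark data."""
        values = sorted(benchmark_values)
        return {p: percentile(values, p) for p in levels}

    # Hypothetical weighted-methods-per-class (WMC) values from a benchmark:
    wmc = [3, 5, 7, 4, 12, 9, 21, 6, 8, 15, 30, 5, 11, 7, 18]
    print(derive_thresholds(wmc))  # {0.7: 11, 0.8: 15, 0.9: 21}
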
DOI: 10.2298/CSIS181012035B