More is Not Always Better: Exploring Early Repair of DNNs


Detailed description

Saved in:
Bibliographic details
Published in: 2024 IEEE/ACM International Workshop on Deep Learning for Testing and Testing for Deep Learning (DeepTest), pp. 13-16
Main authors: Mancu, Andrei, Laurent, Thomas, Rieger, Franz, Arcaini, Paolo, Ishikawa, Fuyuki, Rueckert, Daniel
Format: Conference paper
Language: English
Published: ACM, 20 Apr 2024
Subjects:
Online access: Full text
Description
Summary: DNN repair is an effective technique applied after training to enhance the class-specific accuracy of classifier models where a low failure rate is required on specific classes. The repair methods introduced in recent studies assume that they are applied to fully trained models. In this paper, we argue that this may not always be the best choice. We analyse the performance of DNN models under various combinations of training time and repair. Through carefully designed experiments on two real-world datasets and a carefully curated assessment score, we show that applying DNN repair earlier in the training process, and not only at its end, can be beneficial. We therefore encourage the research community to consider the question of when to apply DNN repair during model development.
DOI:10.1145/3643786.3648024