More is Not Always Better: Exploring Early Repair of DNNs


Detailed Bibliography
Published in: 2024 IEEE/ACM International Workshop on Deep Learning for Testing and Testing for Deep Learning (DeepTest), pp. 13-16
Main Authors: Mancu, Andrei; Laurent, Thomas; Rieger, Franz; Arcaini, Paolo; Ishikawa, Fuyuki; Rueckert, Daniel
Format: Conference paper
Language: English
Published: ACM, 20 April 2024
Online Access: Get full text
Description
Summary: DNN repair is an effective technique applied after training to enhance the class-specific accuracy of classifier models in settings where a low failure rate is required on specific classes. The repair methods introduced in recent studies assume that they are applied to fully trained models. In this paper, we argue that this may not always be the best choice. We analyse the performance of DNN models under various combinations of training time and repair. Through carefully designed experiments on two real-world datasets and a purpose-built assessment score, we show that applying DNN repair earlier in the training process, and not only at its end, can be beneficial. We therefore encourage the research community to consider when to apply DNN repair during model development.
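The record does not include the authors' code, but the core idea of the abstract, interleaving repair passes with training rather than repairing only a fully trained model, can be illustrated with a purely hypothetical toy sketch. Everything below (the per-class "scores", the `train_step` and `repair_step` functions, and the schedule) is an illustrative stand-in, not the paper's method:

```python
def train_step(weights, lr=0.1):
    """Toy stand-in for one epoch of training: each per-class score
    moves a fraction `lr` of the way toward its optimum of 1.0."""
    return [w + lr * (1.0 - w) for w in weights]

def repair_step(weights, target_idx, strength=0.5):
    """Toy stand-in for a DNN repair pass that boosts one
    underperforming class's score toward 1.0."""
    repaired = list(weights)
    repaired[target_idx] += strength * (1.0 - repaired[target_idx])
    return repaired

def train(epochs, repair_every=0, target_idx=2):
    """Run the toy loop, optionally interleaving a repair pass every
    `repair_every` epochs (0 = no repair during training)."""
    weights = [0.0, 0.0, 0.0]
    for epoch in range(1, epochs + 1):
        weights = train_step(weights)
        if repair_every and epoch % repair_every == 0:
            weights = repair_step(weights, target_idx)
    return weights

# Early repair: interleave a repair pass every 3 epochs.
early = train(epochs=10, repair_every=3)
# Late repair: train fully, then repair once at the end.
late = repair_step(train(epochs=10), target_idx=2)
print(early[2], late[2])
```

In this toy model the interleaved schedule ends with a higher score on the targeted class than the repair-at-the-end schedule, which mirrors (but does not demonstrate) the paper's claim that the timing of repair matters.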
DOI:10.1145/3643786.3648024