Please use this identifier to cite or link to this item: http://dspace.iitrpr.ac.in:8080/xmlui/handle/123456789/3550
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Niyaz, U.
dc.contributor.author: Bathula, D.R.
dc.date.accessioned: 2022-06-23T17:35:05Z
dc.date.available: 2022-06-23T17:35:05Z
dc.date.issued: 2022-06-23
dc.identifier.uri: http://localhost:8080/xmlui/handle/123456789/3550
dc.description.abstract: Knowledge distillation (KD) is an effective model compression technique in which a compact student network is taught to mimic the behavior of a complex, highly trained teacher network. In contrast, mutual learning (ML) provides an alternative strategy in which multiple simple student networks benefit from sharing knowledge, even in the absence of a powerful but static teacher network. Motivated by these findings, we propose a single-teacher, multi-student framework that leverages both KD and ML to achieve better performance. Furthermore, an online distillation strategy is used to train the teacher and students simultaneously. To evaluate the proposed approach, extensive experiments were conducted with three different versions of teacher-student networks on benchmark biomedical classification (MSI vs. MSS) and object detection (polyp detection) tasks. An ensemble of student networks trained in the proposed manner achieved better results than ensembles of students trained using KD or ML individually, establishing the benefit of augmenting knowledge transfer from teacher to students with peer-to-peer learning between the students. [en_US]
dc.language.iso: en_US [en_US]
dc.subject: Knowledge distillation [en_US]
dc.subject: Online distillation [en_US]
dc.subject: Peer mutual learning [en_US]
dc.subject: Teacher-student network [en_US]
dc.title: Augmenting Knowledge Distillation with Peer-to-Peer Mutual Learning for Model Compression [en_US]
dc.type: Article [en_US]
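
The abstract describes a single-teacher, multi-student framework in which each student is trained against the ground truth, the teacher's soft predictions (KD), and its peers' soft predictions (ML), with the teacher updated in the same step (online distillation). The sketch below illustrates that kind of combined objective; the loss weights alpha/beta, temperature T, and the detach() choices are illustrative assumptions, not the paper's exact formulation.

    # Minimal sketch of a joint KD + mutual-learning objective for one
    # teacher and several students trained together (online distillation).
    # alpha, beta, and T are assumed hyperparameters for illustration.
    import torch
    import torch.nn.functional as F

    def soft_kl(logits, target_logits, T=4.0):
        """KL divergence between temperature-softened distributions."""
        log_p = F.log_softmax(logits / T, dim=1)
        q = F.softmax(target_logits / T, dim=1)
        return F.kl_div(log_p, q, reduction="batchmean") * (T * T)

    def joint_loss(teacher_logits, student_logits, labels, alpha=0.5, beta=0.5):
        """Combined loss for one optimization step over teacher and students."""
        # Teacher is trained simultaneously on the hard labels (online KD).
        loss = F.cross_entropy(teacher_logits, labels)
        n = len(student_logits)
        for i, s in enumerate(student_logits):
            l = F.cross_entropy(s, labels)                    # supervised term
            l = l + alpha * soft_kl(s, teacher_logits.detach())  # KD from teacher
            for j, p in enumerate(student_logits):            # ML between peers
                if j != i:
                    l = l + beta * soft_kl(s, p.detach()) / (n - 1)
            loss = loss + l
        return loss

With, say, three students, a single backward pass on joint_loss updates the teacher and all students together; detaching the distillation targets keeps each network's gradients confined to its own terms, which is one common design choice in online distillation.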
Appears in Collections:Year-2022

Files in This Item:
File            Size     Format
Full Text.pdf   2.83 MB  Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.