Please use this identifier to cite or link to this item: http://dspace.iitrpr.ac.in:8080/xmlui/handle/123456789/3226
Full metadata record
DC Field                     Value                                                 Language
dc.contributor.author        Kamakshi, V.                                          -
dc.contributor.author        Gupta, U.                                             -
dc.contributor.author        Krishnan, N. C.                                       -
dc.date.accessioned          2021-11-22T09:36:40Z                                  -
dc.date.available            2021-11-22T09:36:40Z                                  -
dc.date.issued               2021-11-22                                            -
dc.identifier.uri            http://localhost:8080/xmlui/handle/123456789/3226     -
dc.description.abstract      Deep CNNs, though they have achieved state-of-the-art performance in image classification tasks, remain black boxes to the humans using them. There is growing interest in explaining how these deep models work in order to improve their trustworthiness. In this paper, we introduce a Posthoc Architecture-agnostic Concept Extractor (PACE) that automatically extracts smaller sub-regions of the image, called concepts, that are relevant to the black-box prediction. PACE tightly integrates faithfulness to the black-box model into the explanatory framework. To the best of our knowledge, this is the first work that automatically extracts class-specific discriminative concepts in a posthoc manner. The PACE framework is used to generate explanations for two different CNN architectures trained to classify the AWA2 and Imagenet-Birds datasets. Extensive human subject experiments are conducted to validate the human interpretability and consistency of the explanations extracted by PACE. The results of these experiments suggest that over 72% of the concepts extracted by PACE are human interpretable.     en_US
dc.language.iso              en_US                                                 en_US
dc.subject                   XAI                                                   en_US
dc.subject                   posthoc explanations                                  en_US
dc.subject                   concept-based explanations                            en_US
dc.subject                   image classifier explanations                         en_US
dc.title                     PACE: Posthoc Architecture-Agnostic Concept Extractor for Explaining CNNs     en_US
dc.type                      Article                                               en_US
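
The abstract above describes PACE as identifying image sub-regions ("concepts") that are relevant to a black-box classifier's prediction. As a rough, hypothetical illustration of that general idea only (this is not the PACE method itself, which learns its concept extractor; the occlusion strategy and all names below are assumptions), the following Python sketch scores the cells of a grid partition of an image by how much masking each cell lowers an arbitrary classifier's top-class probability:

# Hypothetical sketch, NOT the PACE procedure from the paper: a generic
# occlusion-style illustration of scoring image sub-regions for their
# relevance to a black-box classifier's prediction.
import torch
import torch.nn.functional as F

def region_relevance(model, image, grid=4):
    """Mask each cell of a grid x grid partition of `image` and measure
    how much the black-box model's top-class probability drops.

    model : any torch.nn.Module classifier (architecture-agnostic)
    image : tensor of shape (1, 3, H, W)
    """
    model.eval()
    with torch.no_grad():
        probs = F.softmax(model(image), dim=1)
        top_class = int(probs.argmax(dim=1))
        base = probs[0, top_class].item()

        _, _, h, w = image.shape
        ch, cw = h // grid, w // grid
        scores = torch.zeros(grid, grid)
        for i in range(grid):
            for j in range(grid):
                masked = image.clone()
                # Occlude one cell with the image mean.
                masked[:, :, i*ch:(i+1)*ch, j*cw:(j+1)*cw] = image.mean()
                p = F.softmax(model(masked), dim=1)[0, top_class].item()
                # Larger probability drop => more relevant region.
                scores[i, j] = base - p
    return top_class, scores

Unlike the fixed grid cells in this sketch, PACE extracts learned, class-specific concepts, but both rest on the same principle: a sub-region matters to the extent that the black-box prediction depends on it.
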
Appears in Collections: Year-2021

Files in This Item:
File             Description    Size        Format
Full Text.pdf                   15.17 MB    Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.