Please use this identifier to cite or link to this item:
http://dspace.iitrpr.ac.in:8080/xmlui/handle/123456789/2247
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chauhan, J. | - |
dc.contributor.author | Goyal, P. | - |
dc.date.accessioned | 2021-07-28T17:22:29Z | - |
dc.date.available | 2021-07-28T17:22:29Z | - |
dc.date.issued | 2021-07-28 | - |
dc.identifier.uri | http://localhost:8080/xmlui/handle/123456789/2247 | - |
dc.description.abstract | Background and objective: Burns are a serious health problem leading to several thousand deaths annually, and despite advances in science and technology, automated burn diagnosis remains a major challenge. Researchers have been exploring automated, visual-image-based approaches to burn diagnosis. Noting that the impact of a burn on a particular body part is related to skin thickness, we propose a deep convolutional neural network based body part-specific burn severity assessment model (BPBSAM). Method: Considering skin anatomy, BPBSAM estimates burn severity using body part-specific support vector machines trained on CNN features extracted from images of burnt body parts. BPBSAM therefore first identifies the body part in a burn image using a convolutional neural network; the limited availability of burnt body part images for training this network is addressed by using larger available datasets of non-burn images of the body parts considered (face, hand, back, and inner forearm). We prepared rich labelled burn image datasets (BI and UBI) and trained several deep learning models, using existing models as pipelines for body part classification and for feature extraction in severity estimation. Results: The proposed BPBSAM classified burn severity from color images of burn injuries with an overall average F1 score of 77.8% and accuracy of 84.85% on the test BI dataset, and 87.2% and 91.53%, respectively, on the UBI dataset. For body part classification of burn images, an average accuracy of around 93% is achieved, and for burn severity assessment BPBSAM outperformed the generic method in overall average accuracy by 10.61%, 4.55%, and 3.03% with the ResNet50, VGG16, and VGG19 pipelines, respectively. Conclusions: The main contribution of this work, alongside the creation of labelled burn image datasets, is that the proposed customized body part-specific burn severity assessment model can significantly improve performance despite the small size of the burn image dataset. This customized body part-specific approach could also be applied to the burn region segmentation problem. Moreover, fine-tuning a network pre-trained on non-burn body part images has proven robust and reliable. | en_US |
dc.language.iso | en_US | en_US |
dc.subject | Burns | en_US |
dc.subject | Burn images | en_US |
dc.subject | Body part images | en_US |
dc.subject | Classification | en_US |
dc.subject | Deep learning | en_US |
dc.subject | Severity assessment | en_US |
dc.title | BPBSAM: body part-specific burn severity assessment model | en_US |
dc.type | Article | en_US |
Appears in Collections: | Year-2020 |
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
Full Text.pdf | | 3.48 MB | Adobe PDF |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
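
For readers who want a concrete picture of the two-stage pipeline summarized in the abstract (a pretrained CNN backbone supplying features to body part-specific SVM severity classifiers), the following is a minimal illustrative sketch only, not the authors' implementation. It assumes TensorFlow/Keras and scikit-learn, uses an ImageNet-pretrained ResNet50 (one of the backbones named in the abstract), and all helper names, hyperparameters, and the data layout are hypothetical.

```python
# Sketch of the two-stage idea described in the abstract:
# (1) a pretrained CNN backbone extracts image features,
# (2) one SVM per body part predicts burn severity from those features.
# Paths, labels, and hyperparameters below are placeholders.

import numpy as np
from tensorflow.keras.applications import ResNet50
from tensorflow.keras.applications.resnet50 import preprocess_input
from tensorflow.keras.preprocessing import image
from sklearn.svm import SVC

BODY_PARTS = ["face", "hand", "back", "inner_forearm"]  # classes named in the abstract

# Frozen ImageNet backbone used only as a feature extractor (2048-D pooled features).
backbone = ResNet50(weights="imagenet", include_top=False, pooling="avg")

def extract_features(img_paths):
    """Load RGB images, resize to 224x224, and return backbone features."""
    batch = np.stack([
        image.img_to_array(image.load_img(p, target_size=(224, 224)))
        for p in img_paths
    ])
    return backbone.predict(preprocess_input(batch), verbose=0)

def train_body_part_svms(samples):
    """samples: dict body_part -> (list_of_image_paths, severity_labels).
    Returns one severity classifier per body part."""
    models = {}
    for part in BODY_PARTS:
        paths, labels = samples[part]
        clf = SVC(kernel="rbf", C=1.0)  # illustrative hyperparameters
        clf.fit(extract_features(paths), labels)
        models[part] = clf
    return models

def predict_severity(models, body_part, img_path):
    """Route a new burn image to the SVM of its body part."""
    return models[body_part].predict(extract_features([img_path]))[0]
```

In the method described by the abstract, the body part label itself would come from a separate CNN fine-tuned from non-burn body part images; the sketch simply takes that label as an input to keep the example short.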