Please use this identifier to cite or link to this item:
http://dspace.iitrpr.ac.in:8080/xmlui/handle/123456789/1039
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Sikka, A. | |
dc.contributor.author | Peri, S.V. | |
dc.contributor.author | Bathula, D.R. | |
dc.date.accessioned | 2018-12-20T10:59:47Z | |
dc.date.available | 2018-12-20T10:59:47Z | |
dc.date.issued | 2018-12-20 | |
dc.identifier.uri | http://localhost:8080/xmlui/handle/123456789/1039 | |
dc.description.abstract | Recent studies suggest that combined analysis of magnetic resonance imaging (MRI), which measures brain atrophy, and positron emission tomography (PET), which quantifies hypo-metabolism, provides improved accuracy in diagnosing Alzheimer's disease. However, such techniques are limited by the availability of corresponding scans of each modality. Current work focuses on a cross-modal approach to estimate FDG-PET scans for the given MR scans using a 3D U-Net architecture. The use of the complete MR image instead of a local patch based approach helps in capturing non-local and non-linear correlations between MRI and PET modalities. The quality of the estimated PET scans is measured using quantitative metrics such as MAE, PSNR and SSIM. The efficacy of the proposed method is evaluated in the context of Alzheimer's disease classification. The accuracy using only MRI is 70.18% while joint classification using synthesized PET and MRI is 74.43% with a p-value of 0.06. The significant improvement in diagnosis demonstrates the utility of the synthesized PET scans for multi-modal analysis. | en_US |
dc.language.iso | en_US | en_US |
dc.title | MRI to FDG-PET: cross-modal synthesis using 3D U-Net for multi-modal Alzheimer’s classification | en_US |
dc.title.alternative | | en_US |
dc.type | Article | en_US |
Appears in Collections: | Year-2018 |
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
Full Text.pdf | | 1.33 MB | Adobe PDF |
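For context on the evaluation described in the abstract: the quality of a synthesized PET volume is compared against a reference volume using MAE and PSNR (SSIM is typically computed with a library such as scikit-image). A minimal NumPy sketch of the first two metrics, using toy random volumes rather than actual scan data, might look like this; the function and variable names here are illustrative, not taken from the paper:

```python
import numpy as np

def mae(est, ref):
    """Mean absolute error between an estimated and a reference volume."""
    return float(np.mean(np.abs(est - ref)))

def psnr(est, ref, data_range=1.0):
    """Peak signal-to-noise ratio in dB, for intensities in [0, data_range]."""
    mse = float(np.mean((est - ref) ** 2))
    return float("inf") if mse == 0 else 10.0 * np.log10(data_range ** 2 / mse)

# Toy example: a synthetic "estimated" volume built by perturbing a reference.
rng = np.random.default_rng(0)
ref = rng.random((8, 8, 8))
est = np.clip(ref + 0.01 * rng.standard_normal(ref.shape), 0.0, 1.0)

print(f"MAE:  {mae(est, ref):.4f}")
print(f"PSNR: {psnr(est, ref):.2f} dB")
```

Lower MAE and higher PSNR indicate a closer match; identical volumes yield MAE 0 and infinite PSNR.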