Please use this identifier to cite or link to this item: http://dspace.iitrpr.ac.in:8080/xmlui/handle/123456789/2422
Title: Motion saliency based generative adversarial network for underwater moving object segmentation
Authors: Patil, P. W.
Thawakar, O.
Dudhane, A.
Murala, S.
Keywords: Underwater medium
Generative adversarial network
Frame segmentation
Issue Date: 20-Aug-2021
Abstract: Underwater moving object segmentation is a challenging task. Absorption, scattering, and attenuation of light rays between the scene and the imaging platform degrade the visibility of images and video frames. Back-scattering of light rays further complicates underwater video analysis, because the light rays interact with underwater particles and are scattered back to the sensor. In this paper, a novel Motion Saliency Based Generative Adversarial Network (GAN) for Underwater Moving Object Segmentation (MOS) is proposed. The proposed network comprises both identity mapping and dense connections for underwater MOS. To the best of our knowledge, this is the first paper to apply GAN-based unpaired learning to MOS in underwater videos. Initially, the motion saliency of the current frame is estimated from a few initial video frames and the current frame. The estimated motion saliency is then given as input to the proposed network for foreground estimation. To examine the effectiveness of the proposed network, the Fish4Knowledge [1] underwater video dataset and challenging video categories of the ChangeDetection.net-2014 [2] dataset are considered. The segmentation accuracy of existing state-of-the-art methods is compared with that of the proposed approach in terms of average F-measure. The experimental results show that the proposed network achieves significant improvement over existing state-of-the-art methods for MOS.
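As a rough illustration of the pipeline the abstract outlines, the sketch below shows one plausible way to estimate a current-frame motion-saliency map from a few initial frames (here via a temporal-median background model, which is an assumption; the paper does not specify its estimator on this page), plus the average F-measure used for comparison. The helper names estimate_motion_saliency and f_measure are hypothetical, not from the paper.

```python
import numpy as np

def estimate_motion_saliency(initial_frames, current_frame):
    """Hypothetical sketch: build a background estimate as the temporal
    median of a few initial frames, then take its absolute difference
    with the current frame as a motion-saliency map in [0, 1].
    The paper's actual saliency estimator may differ."""
    background = np.median(np.stack(initial_frames, axis=0), axis=0)
    diff = np.abs(current_frame.astype(np.float32) - background)
    if diff.ndim == 3:                 # collapse color channels, if any
        diff = diff.mean(axis=2)
    return diff / (diff.max() + 1e-8)  # normalize to [0, 1]

def f_measure(pred_mask, gt_mask):
    """F-measure on binary foreground masks: the harmonic mean of
    precision and recall (averaged over frames for the reported score)."""
    tp = np.logical_and(pred_mask, gt_mask).sum()
    precision = tp / (pred_mask.sum() + 1e-8)
    recall = tp / (gt_mask.sum() + 1e-8)
    return 2 * precision * recall / (precision + recall + 1e-8)
```

Under this sketch, the normalized saliency map would serve as the input to the proposed generator for foreground estimation, and f_measure would score a predicted binary mask against its ground truth.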
URI: http://dspace.iitrpr.ac.in:8080/xmlui/handle/123456789/2422
Appears in Collections: Year-2019

Files in This Item:
File            Description    Size       Format
Full Text.pdf                  1.28 MB    Adobe PDF

