Please use this identifier to cite or link to this item: http://dspace.iitrpr.ac.in:8080/xmlui/handle/123456789/2473
Title: FgGAN: a cascaded unpaired learning for background estimation and foreground segmentation
Authors: Patil, P. W.
Murala, S.
Issue Date: 25-Aug-2021
Abstract: Moving object segmentation (MOS) in videos with bad weather, irregular object motion, camera jitter, shadows, and dynamic backgrounds remains an open problem for computer vision applications. To address these issues, this paper proposes the Foreground Generative Adversarial Network (FgGAN), which combines the recent concepts of generative adversarial networks (GANs) and unpaired training for background estimation and foreground segmentation. To the best of our knowledge, this is the first paper to apply GAN-based unpaired learning to MOS. Initially, a video-wise background is estimated using a GAN-based unpaired learning network (network-I). Then, to extract foreground motion information, motion saliency is estimated from the estimated background and the current video frame. This estimated motion saliency is given as input to a second GAN-based unpaired learning network (network-II) for foreground segmentation. To examine the effectiveness of the proposed FgGAN (cascaded networks I and II), challenging video categories such as dynamic background, bad weather, intermittent object motion, and shadow are collected from the ChangeDetection.net-2014 [26] database. Segmentation accuracy is evaluated qualitatively and quantitatively in terms of F-measure and percentage of wrong classifications (PWC) and compared with existing state-of-the-art methods. The experimental results show that the proposed FgGAN yields significant improvements in F-measure and PWC over existing state-of-the-art MOS methods.
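
The cascaded inference path described in the abstract can be illustrated with a minimal sketch: network-I estimates a video-wise background, motion saliency is computed from the estimated background and the current frame, and network-II maps that saliency to a foreground mask. The sketch below is a hypothetical PyTorch illustration only; the class TinyGenerator, the names net_I, net_II, and segment_frame, and all layer choices are placeholder assumptions, not the authors' actual FgGAN architectures, losses, or unpaired training procedure (which are detailed in the full paper).

# Hypothetical sketch of the cascaded FgGAN inference path (not the authors' code).
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Placeholder encoder-decoder standing in for either GAN generator."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(16, out_ch, 3, padding=1),
        )
    def forward(self, x):
        return self.net(x)

net_I = TinyGenerator(in_ch=3, out_ch=3)   # background estimation (network-I)
net_II = TinyGenerator(in_ch=1, out_ch=1)  # foreground segmentation (network-II)

def segment_frame(frame):
    """frame: (1, 3, H, W) tensor in [0, 1]; returns a soft foreground mask."""
    with torch.no_grad():
        background = net_I(frame)                                        # estimated video-wise background
        saliency = (frame - background).abs().mean(dim=1, keepdim=True)  # motion saliency from frame vs. background
        mask = torch.sigmoid(net_II(saliency))                           # foreground probability map
    return mask

mask = segment_frame(torch.rand(1, 3, 240, 320))
print(mask.shape)  # torch.Size([1, 1, 240, 320])

In practice both generators would be trained adversarially with unpaired data as the abstract describes; the absolute-difference saliency used here is only one plausible way to combine the estimated background with the current frame.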
URI: http://localhost:8080/xmlui/handle/123456789/2473
Appears in Collections: Year-2019

Files in This Item:
File: Full Text.pdf (1.17 MB, Adobe PDF)

