INSTITUTIONAL DIGITAL REPOSITORY

Automatic group affect analysis in images via visual attribute and feature networks

dc.contributor.author Ghosh, S.
dc.contributor.author Dhall, A.
dc.contributor.author Sebe, N.
dc.date.accessioned 2019-05-16T12:42:27Z
dc.date.available 2019-05-16T12:42:27Z
dc.date.issued 2019-05-16
dc.identifier.uri http://localhost:8080/xmlui/handle/123456789/1246
dc.description.abstract This paper proposes a pipeline for automatic group-level affect analysis. A deep neural network-based approach is proposed, which leverages facial expression information, scene information and high-level facial visual attribute information. A capsule network-based architecture is used to predict the facial expression. Transfer learning is applied to Inception-V3 to extract global image-based features that capture scene information. Another network is trained to infer the facial attributes of the group members. These attributes are then pooled at the group level to train a network that infers group-level affect. The facial attribute prediction network, although simple, is effective and generates results comparable to state-of-the-art methods. Finally, the outputs of the three channels are integrated. Experiments show the effectiveness of the proposed techniques on three ‘in the wild’ databases: the Group Affect Database, HAPPEI and the UCLA-Protest database. en_US
dc.language.iso en_US en_US
dc.subject Group-level affect recognition en_US
dc.title Automatic group affect analysis in images via visual attribute and feature networks en_US
dc.type Article en_US
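
The abstract above describes a three-channel design: per-face expression scores from a capsule network, global scene features from Inception-V3, and facial attributes pooled over the group, fused into a group-level affect prediction. The sketch below illustrates that kind of late fusion in PyTorch. Module names, feature dimensions, mean pooling and averaged-logit fusion are illustrative assumptions, not the authors' exact architecture.

```python
# Hypothetical sketch of a three-channel group-affect fusion head.
# Assumes per-face expression probabilities and attribute vectors are
# already extracted upstream; only the pooling and fusion are shown.
import torch
import torch.nn as nn


class GroupAffectFusion(nn.Module):
    def __init__(self, n_classes=3, scene_dim=2048, attr_dim=40, expr_dim=7):
        super().__init__()
        # One small head per channel, each mapping its features to class logits.
        self.scene_head = nn.Linear(scene_dim, n_classes)  # Inception-V3 global features
        self.attr_head = nn.Linear(attr_dim, n_classes)    # group-pooled facial attributes
        self.expr_head = nn.Linear(expr_dim, n_classes)    # group-pooled expression scores

    def forward(self, scene_feat, face_attrs, face_exprs):
        # scene_feat: (B, scene_dim) global image features
        # face_attrs: (B, N_faces, attr_dim) per-face attribute vectors
        # face_exprs: (B, N_faces, expr_dim) per-face expression probabilities
        pooled_attrs = face_attrs.mean(dim=1)  # average-pool attributes over the group
        pooled_exprs = face_exprs.mean(dim=1)  # average-pool expressions over the group
        logits = (self.scene_head(scene_feat)
                  + self.attr_head(pooled_attrs)
                  + self.expr_head(pooled_exprs)) / 3.0  # late fusion by averaging logits
        return logits


if __name__ == "__main__":
    model = GroupAffectFusion()
    scene = torch.randn(2, 2048)
    attrs = torch.randn(2, 5, 40)                      # 5 detected faces per image
    exprs = torch.softmax(torch.randn(2, 5, 7), dim=-1)
    print(model(scene, attrs, exprs).shape)            # torch.Size([2, 3])
```

Averaging logits is one simple way to integrate the channels; weighted combinations or training a separate fusion layer are equally plausible readings of the abstract's "model integration" step.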

