Please use this identifier to cite or link to this item:
http://dspace.iitrpr.ac.in:8080/xmlui/handle/123456789/3932
Title: | NEAT activity detection using smartwatch at low sampling frequency |
Authors: | Dewan, A.; Gunturi, V.M.V.; Naik, V.; Dutta, K.K. |
Keywords: | Activity recognition; Low frequency; Smart watch sensors |
Issue Date: | 26-Aug-2022 |
Abstract: | Our paper aims to build a classification model to discern the typical NEAT (Non-Exercise Activity Thermogenesis) activities performed in a home setting. NEAT is broadly defined as the energy spent on everything that is not sleeping, eating, or a traditional form of physical exercise. In this paper we focus on the following NEAT activities - cooking, sweeping, mopping, walking, climbing up, and climbing down - along with non-NEAT activities (e.g., watching television and working at a desk). Our aim is to build a classification model that can work with data sampled at a low frequency of 1 Hz. However, building such a classifier is non-trivial because the NEAT activities are not easily separable in low-frequency data. The state-of-the-art in human activity recognition either uses multiple physical devices (e.g., accelerometers on the arms, waist, and feet) for data collection or uses data sampled at a high frequency (20 Hz or above). In contrast, our model performs NEAT activity recognition using data sampled at 1 Hz from a single smartwatch worn on the dominant hand, making it more energy-efficient and practical for widespread use. We evaluate our proposed model using real data collected on a smartwatch and compare it with alternative models. Our results indicate that the proposed model achieves much higher accuracy than the alternative approaches. |
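The abstract does not specify the feature set or classifier used; as a rough illustration of the kind of pipeline it describes, the sketch below windows 1 Hz tri-axial accelerometer readings from a single watch, extracts simple statistical features per window, and fits an off-the-shelf classifier over the activity classes named above. The 30-second window length, the feature set, and the RandomForest choice are assumptions for illustration only, not the authors' published model.

```python
# Minimal sketch (NOT the paper's model): classify NEAT activities from
# 1 Hz accelerometer windows. Window size, features, and classifier are
# illustrative assumptions; the abstract does not describe these details.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Activity classes named in the abstract.
ACTIVITIES = ["cooking", "sweeping", "mopping", "walking",
              "climbing_up", "climbing_down", "non_neat"]

def extract_features(window):
    """window: (n_samples, 3) array of x/y/z acceleration sampled at 1 Hz."""
    magnitude = np.linalg.norm(window, axis=1)
    return np.concatenate([
        window.mean(axis=0),                   # per-axis mean
        window.std(axis=0),                    # per-axis standard deviation
        [magnitude.mean(), magnitude.std(),
         magnitude.max() - magnitude.min()],   # overall motion spread
    ])

def make_dataset(recordings, window_size=30):
    """recordings: list of (samples, label); at 1 Hz, 30 rows = 30 seconds."""
    X, y = [], []
    for samples, label in recordings:
        for start in range(0, len(samples) - window_size + 1, window_size):
            X.append(extract_features(samples[start:start + window_size]))
            y.append(label)
    return np.array(X), np.array(y)

# Synthetic data standing in for real smartwatch recordings.
rng = np.random.default_rng(0)
recordings = [(rng.normal(0, 1 + i * 0.2, size=(120, 3)), act)
              for i, act in enumerate(ACTIVITIES)]
X, y = make_dataset(recordings)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict(X[:3]))
```

Non-overlapping windows keep the example short; an actual low-frequency pipeline might instead use overlapping windows to compensate for the few samples available per window at 1 Hz.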
URI: | http://dspace.iitrpr.ac.in:8080/xmlui/handle/123456789/3932 |
Appears in Collections: | Year-2021 |
Files in This Item:
File | Description | Size | Format
---|---|---|---
Full Text.pdf | | 1.21 MB | Adobe PDF