Development of a Human Activity Recognition System for Ballet Tasks

Background: Accurate and detailed measurement of a dancer’s training volume is a key requirement for understanding the relationship between a dancer’s pain and training volume. Currently, no system exists that can quantify a dancer’s training volume with respect to specific movement activities. Machine learning models applied to wearable sensor data have previously been used for human activity recognition in sports such as cricket, tennis and rugby. The purpose of this study was therefore to develop a human activity recognition system using wearable sensor data to accurately identify key ballet movements (jumping and lifting the leg). Our primary objective was to determine whether machine learning can accurately identify key ballet movements during dance training. The secondary objective was to determine the influence of the location and number of sensors on accuracy.

Results: Convolutional neural networks were applied to develop two models for every combination of the six sensors (all six, then five, four, three, etc.), with and without the inclusion of transition movements. At the first level of classification, using data from all sensors and excluding transitions, the model achieved 97.8% accuracy. Accuracy decreased at the second (83.0%) and third (75.1%) levels of classification, and decreased further with the inclusion of transitions, with fewer sensors, and across the various sensor combinations.

Conclusion: The models developed were robust enough to identify jumping and leg-lifting tasks in real-world exposures in dancers. The system provides a novel method for measuring dancer training volume through quantification of specific movement tasks. Such a system can be used to further understand the relationship between dancers’ pain and training volume, and in athlete monitoring systems. Further, this provides a proof of concept that can readily be translated to other lower-limb-dominant sporting activities.
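The record describes the modelling approach only at a high level. As a rough illustration of that approach, the sketch below builds a small 1D convolutional network over windowed inertial-sensor data and trains one model per sensor subset; the sensor placements, window length, channel count, class set and architecture are assumptions made for illustration, not the authors’ published implementation.

```python
# Illustrative sketch only: this record does not give the authors' exact
# architecture, sensor placements, window length or class set, so every
# name and shape below is an assumption, not the published implementation.
from itertools import combinations

import tensorflow as tf

SENSORS = ["thorax", "lumbar", "pelvis",
           "right_thigh", "right_shank", "left_shank"]  # assumed placements
WINDOW = 128              # assumed samples per classification window
CHANNELS_PER_SENSOR = 3   # e.g. a tri-axial accelerometer per sensor


def build_cnn(n_sensors: int, n_classes: int) -> tf.keras.Model:
    """A small 1D CNN that classifies one window of stacked sensor channels."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(WINDOW, n_sensors * CHANNELS_PER_SENSOR)),
        tf.keras.layers.Conv1D(32, kernel_size=5, activation="relu"),
        tf.keras.layers.MaxPooling1D(pool_size=2),
        tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


# One model per sensor subset (6 sensors, then 5, 4, ...), echoing the
# "every combination of six sensors" protocol described in the abstract.
for k in range(len(SENSORS), 0, -1):
    for subset in combinations(SENSORS, k):
        model = build_cnn(n_sensors=k, n_classes=3)  # e.g. jump / leg lift / other
        # Training data would be windowed sensor recordings, restricted to
        # the channels of `subset`:
        #   X: (n_windows, WINDOW, k * CHANNELS_PER_SENSOR), y: (n_windows,)
        # model.fit(X, y, epochs=20, validation_split=0.2)
```

The abstract’s three “levels of classification” suggest a hierarchy of such classifiers (e.g. coarse movement categories refined into more specific tasks at each level), but the record does not specify how the levels were split.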

Bibliographic Details
Main Authors: Hendry, Danica; Chai, K.; Campbell, Amity; Hopper, L.; O’Sullivan, P.; Straker, Leon
Format: Journal Article
Language: English
Published: 2020
Online Access: http://hdl.handle.net/20.500.11937/79987
DOI: 10.1186/s40798-020-0237-5
License: Creative Commons Attribution 4.0 (http://creativecommons.org/licenses/by/4.0/)
Repository: Curtin Institutional Repository