Advancement of Discrimination Accuracy of Musical Development Degree Utilizing Machine Learning with Optimized Feature Quantities Extracted from Simultaneous Analysis of Both Eye Movement and Body Movement During Musical Expression in Early Childhood
DOI: https://doi.org/10.63002/assm.210.665

Keywords: musical expression, eye movement, body movement, simultaneous analysis, two-way analysis of variance, machine learning, classifier

Abstract
Measuring children's musical development through expert observation has been an essential part of music education. However, the degree of such development is often hard to quantify. A valid and reliable tool is needed, ideally one based on objective movement data extracted from the full body, including the eyes. This study aims to extract feature quantities that contribute effectively to classifying the developmental process of musical expression in early childhood, through the simultaneous analysis of eye movement and body movement in 3-, 4-, and 5-year-olds' musical expression of a specific song. The data calculated for 2022 and 2023 were therefore subjected to a two-way analysis of variance with facility and age as factors. As a result, statistically significant differences were observed in the moving distance and moving average velocity of the head, right shoulder, and right hand, in the moving average acceleration of the right shoulder and right hand, and in the movement smoothness of the right hand.
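For illustration only, the following is a minimal sketch of how such a two-way ANOVA (facility × age) on a single movement feature could be run in Python with statsmodels; the data file and column names (facility, age, moving_distance) are hypothetical placeholders, not the study's actual data or code.

```python
# Minimal sketch of a two-way ANOVA (facility x age) on one movement feature.
# The data file and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical table: one row per child, with facility, age group, and a feature.
df = pd.read_csv("movement_features.csv")  # columns: facility, age, moving_distance

# Fit a linear model with main effects and the facility x age interaction.
model = ols("moving_distance ~ C(facility) * C(age)", data=df).fit()

# Type II ANOVA table with F statistics and p-values for each factor.
anova_table = sm.stats.anova_lm(model, typ=2)
print(anova_table)
```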
Furthermore, feature quantities were extracted from the results of this quantitative, simultaneous analysis of eye movement and body movement in musical expression from 2022 to 2023, fed into machine learning, and the classification accuracy was calculated. Vision is clearly important for facilitating movement and stabilizing posture, as well as for enabling music-induced motion of the body parts. In the machine learning results, several classifiers, such as a neural network (NN) and a support vector machine (SVM), showed better classification accuracy when both eye movement data and body movement data from the simultaneous analysis were used than when only body movement data were used. Specifically, applying both eye movement and body movement data to the NN yielded a classification accuracy of 74.42%, compared with 55.81% for body movement data alone. These results verified the greater efficacy of simultaneous analysis of musical expression and the strong linkage between eye movement and body movement in early childhood.
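As a hedged illustration of this comparison, the sketch below trains an NN (multilayer perceptron) and an SVM on body-only features versus combined eye + body features and reports cross-validated accuracy; the feature arrays, labels, sample size, and classifier settings are assumptions for demonstration, not the paper's actual pipeline.

```python
# Minimal sketch of the classifier comparison: body-only features vs.
# combined eye + body features, evaluated with an NN (MLP) and an SVM.
# Feature arrays, labels, and hyperparameters are hypothetical placeholders.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 120                                   # hypothetical number of samples
X_body = rng.normal(size=(n, 12))         # body movement feature quantities
X_eye = rng.normal(size=(n, 4))           # eye movement feature quantities
y = rng.integers(0, 3, size=n)            # age-group labels (3, 4, 5 years old)

feature_sets = {
    "body only": X_body,
    "eye + body": np.hstack([X_eye, X_body]),
}
classifiers = {
    "NN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    "SVM": SVC(kernel="rbf", C=1.0),
}

for feat_name, X in feature_sets.items():
    for clf_name, clf in classifiers.items():
        pipe = make_pipeline(StandardScaler(), clf)
        acc = cross_val_score(pipe, X, y, cv=5).mean()
        print(f"{clf_name} on {feat_name}: mean CV accuracy = {acc:.4f}")
```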
License
Copyright (c) 2024 Mina Sano
This work is licensed under a Creative Commons Attribution 4.0 International License.