What are Hard Samples for Gait Emotion Recognition?

DOI:

https://doi.org/10.5753/jis.2026.6799

Keywords:

Gait Analysis, Emotion Recognition, Dataset Cartography, Human Behavior Recognition, Biomechanics

Abstract

Emotion recognition is a key capability for understanding human behavior, enabling interactive systems to support natural interaction with their users. Among the diverse ways the literature tackles this task, gait emotion recognition (inferring perceived emotion from how someone walks) remains challenging due to limited data availability and inconsistent, often overlapping annotations, which create ambiguous scenarios and harm both training and validation. Given this challenge, we pose the following question: what are hard samples for gait emotion recognition? To answer it, we investigate sample complexity in gait-based emotion recognition and its impact on learning dynamics, focusing on ambiguous and hard-to-learn samples using dataset cartography, a technique that maps samples based on model correctness and confidence. Our findings suggest that easier samples are those with clear emotional cues, as expected, whereas ambiguous and difficult samples involve restrained movements, static frames, and annotation inconsistencies. These insights can inform dataset curation and model training strategies for more reliable gait emotion recognition.
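The dataset cartography technique referenced above (Swayamdipta et al., 2020) characterizes each training sample by statistics of the model's behavior across epochs: confidence (mean probability assigned to the gold label) and variability (its standard deviation). A minimal sketch of those statistics, assuming per-epoch gold-label probabilities have already been logged; the region thresholds are illustrative assumptions, not values from the paper:

```python
import numpy as np

def cartography_metrics(gold_probs):
    """gold_probs: array of shape (epochs, samples) holding the model's
    probability for each sample's gold label at each training epoch."""
    confidence = gold_probs.mean(axis=0)   # mean gold-label probability
    variability = gold_probs.std(axis=0)   # spread across epochs
    return confidence, variability

def region(conf, var, conf_hi=0.7, var_hi=0.2):
    """Heuristic assignment to a cartography region (thresholds assumed)."""
    if var >= var_hi:
        return "ambiguous"
    return "easy-to-learn" if conf >= conf_hi else "hard-to-learn"
```

For example, a sample whose gold-label probability stays near 0.9 in every epoch would land in the easy-to-learn region, one stuck near 0.1 in the hard-to-learn region, and one oscillating between 0.2 and 0.9 in the ambiguous region.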


References

Bhattacharya, U., Mittal, T., Chandra, R., Randhavane, T., Bera, A., and Manocha, D. (2020). STEP: Spatial temporal graph convolutional networks for emotion perception from gaits. In Proceedings of the AAAI Conference on Artificial Intelligence, 34(02). DOI: https://doi.org/10.1609/aaai.v34i02.5490.

Habibie, I., Holden, D., Schwarz, J., Yearsley, J., and Komura, T. (2017). A recurrent variational autoencoder for human motion synthesis. In Proceedings of the 28th British Machine Vision Conference (BMVC), London, United Kingdom. DOI: https://doi.org/10.5244/C.31.119.

Halovic, S. and Kroos, C. (2018). Not all is noticed: Kinematic cues of emotion-specific gait. Human Movement Science, 57:478–488. DOI: https://doi.org/10.1016/j.humov.2017.11.008.

Iwashita, Y., Kurazume, R., Ogawara, K., Tanaka, T., and Utsumi, A. (2013). Gait-based person identification robust to changes in appearance. IEEE Transactions on Image Processing, 22(6):2421–2431. DOI: https://doi.org/10.1109/TIP.2013.2246179.

Kipf, T. N. and Welling, M. (2017). Semi-supervised classification with graph convolutional networks. In International Conference on Learning Representations (ICLR), 2017.

Li, J., Dai, X., Yan, R., Tang, C., and Li, Y. (2025). Multi-anchor adaptive fusion and bi-focus attention for enhanced gait-based emotion recognition. Scientific Reports, 15:97922. DOI: https://doi.org/10.1038/s41598-025-97922-3.

Liao, R., Yu, S., An, W., and Huang, Y. (2020). A model-based gait recognition method with body pose and human prior knowledge. Pattern Recognition, 98:107069. DOI: https://doi.org/10.1016/j.patcog.2019.107069.

Lima, M. L., de Lima Costa, W., Martinez, E. T., and Teichrieb, V. (2024). ST-Gait++: Leveraging spatio-temporal convolutions for gait-based emotion recognition on videos. DOI: https://doi.org/10.48550/arXiv.2405.13903.

Lopez, L. D., Reschke, P. J., Knothe, J. M., and Walle, E. A. (2017). Postural communication of emotion: Perception of distinct poses of five discrete emotions. Frontiers in Psychology, 8:710. DOI: https://doi.org/10.3389/fpsyg.2017.00710.

Makihara, Y., Mannami, H., Tsuji, A., Hossain, M. A., Sugiura, K., Mori, A., and Yagi, Y. (2012). The OU-ISIR gait database comprising the treadmill dataset. IPSJ Transactions on Computer Vision and Applications, 4:53–62. DOI: https://doi.org/10.2197/ipsjtcva.4.53.

Montepare, J. M., Goldstein, S. B., and Clausen, A. (1987). The identification of emotions from gait information. Journal of Nonverbal Behavior, 11(1):33–42. DOI: https://doi.org/10.1007/BF00999605.

Reynolds, R. M., Novotny, E., Lee, J., Roth, D., and Bente, G. (2019). Ambiguous bodies: The role of displayed arousal in emotion [mis]perception. Journal of Nonverbal Behavior, 43:529–548. DOI: https://doi.org/10.1007/s10919-019-00312-3.

Roether, C. L., Omlor, L., Christensen, A., and Giese, M. A. (2009). Critical features for the perception of emotion from gait. Journal of Vision, 9(6):15. DOI: https://doi.org/10.1167/9.6.15.

Swayamdipta, S., Schwartz, R., Lourie, N., Wang, Y., Hajishirzi, H., Smith, N. A., and Choi, Y. (2020). Dataset cartography: Mapping and diagnosing datasets with training dynamics. In Webber, B., Cohn, T., He, Y., and Liu, Y., editors, Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 9275–9293, Online. Association for Computational Linguistics. DOI: https://doi.org/10.18653/v1/2020.emnlp-main.746.

Valli, A. (2007). Notes on natural interaction. Available at: [link]. Accessed on 22 April 2026.

Winter, D. A. (2009). Biomechanics and Motor Control of Human Movement. John Wiley & Sons, Hoboken, New Jersey, 4th edition.

Yan, S., Xiong, Y., and Lin, D. (2018). Spatial temporal graph convolutional networks for skeleton-based action recognition. Proceedings of the AAAI Conference on Artificial Intelligence, 32. DOI: https://doi.org/10.1609/aaai.v32i1.12328.

Yin, Y., Jing, L., Huang, F., Yang, G., and Wang, Z. (2022). MSA-GCN: Multiscale adaptive graph convolution network for gait emotion recognition. DOI: https://doi.org/10.48550/arXiv.2209.08988.

Zhai, Y., Jia, G., Lai, Y.-K., Zhang, J., Yang, J., and Tao, D. (2024). Looking into gait for perceiving emotions via bilateral posture and movement graph convolutional networks. IEEE Transactions on Affective Computing, 15(3):1634–1648. DOI: https://doi.org/10.1109/TAFFC.2024.3365694.

Zhou, J., Xiong, H., Lu, J., Lin, Z., and Feng, B. (2025). CGTGait: Collaborative graph and transformer for gait emotion recognition. DOI: https://doi.org/10.48550/arXiv.2509.16623.

Zhou, T., Wang, S., and Bilmes, J. (2020). Curriculum learning by dynamic instance hardness. In Advances in Neural Information Processing Systems, volume 33, pages 8602–8613. Curran Associates, Inc.

Published

2026-04-29

How to Cite

SOUZA, T. K. S. de; COSTA, W. de L.; TEICHRIEB, V. What are Hard Samples for Gait Emotion Recognition?. Journal on Interactive Systems, Porto Alegre, RS, v. 17, n. 1, p. 387–397, 2026. DOI: 10.5753/jis.2026.6799. Available at: https://journals-sol.sbc.org.br/index.php/jis/article/view/6799. Accessed on: 11 May 2026.

Section

Regular Paper
