Exploring approaches to compare emotional responses to the same stimuli from different individuals

Authors

DOI:

https://doi.org/10.5753/jis.2025.5421

Keywords:

Affective Computing, Facial Expressions, Comparison of emotional responses

Abstract

Understanding human emotional behavior is a challenging yet crucial endeavor for enhancing user experience through the application of Affective Computing techniques. These methods have the potential to foster more natural and emotionally responsive interactions between users and systems. In a multicultural world, it is equally important to understand the varied emotional responses individuals have to the same stimulus, ensuring that software adaptations and interventions are appropriately tailored. This study examines two methodologies for comparing how individuals from diverse backgrounds react emotionally to two audiovisual stimuli. By utilizing data analysis and machine learning tools, the study aims to explore whether two individuals react similarly to the same stimulus. Alongside an exploration of the unique features of each approach, the research validates these methods by analyzing the emotional responses of 39 participants, identifying both commonalities and differences. The findings not only underscore the approaches' effectiveness but also highlight their potential for complementing one another.
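
For illustration only (this is not the pipeline described in the paper), the sketch below shows one simple way to quantify whether two participants respond similarly to the same stimulus, assuming each response is available as a per-frame vector of emotion scores (for instance, derived from facial expression analysis). The emotion labels, the synthetic data, and the choice of Pearson correlation and cosine similarity are assumptions made for the example.

import numpy as np
from scipy.stats import pearsonr

# Hypothetical emotion categories; real systems may use a different set.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

# Synthetic stand-ins for two participants' responses to the same stimulus:
# 300 frames, each a probability distribution over the emotion categories.
rng = np.random.default_rng(0)
participant_a = rng.dirichlet(np.ones(len(EMOTIONS)), size=300)
participant_b = rng.dirichlet(np.ones(len(EMOTIONS)), size=300)

def similarity_per_emotion(a, b):
    """Pearson correlation of each emotion's intensity curve over the stimulus."""
    return {e: pearsonr(a[:, i], b[:, i])[0] for i, e in enumerate(EMOTIONS)}

def overall_profile_similarity(a, b):
    """Cosine similarity between the two time-averaged emotion profiles."""
    ma, mb = a.mean(axis=0), b.mean(axis=0)
    return float(np.dot(ma, mb) / (np.linalg.norm(ma) * np.linalg.norm(mb)))

print(similarity_per_emotion(participant_a, participant_b))
print(f"overall profile similarity: {overall_profile_similarity(participant_a, participant_b):.3f}")

High per-emotion correlations would suggest the two participants' reactions rise and fall together during the stimulus, while the profile similarity summarizes how close their average emotional responses are; the paper itself evaluates its own two approaches on data from 39 participants.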

References

Aguiar, G., Esteves, J., Júnior, C., Nascimento, T., and Aranha, R. (2024). Únicos, mas não incomparáveis: abordagens para identificação de similaridades em respostas emocionais de diferentes indivíduos ao mesmo estímulo audiovisual. In Proceedings of the 30th Brazilian Symposium on Multimedia and the Web, pages 336–344, Porto Alegre, RS, Brasil. SBC. DOI: http://doi.org/10.5753/webmedia.2024.241432.

Agung, E. S., Rifai, A. P., and Wijayanto, T. (2024). Image-based facial emotion recognition using convolutional neural network on emognition dataset. Scientific reports, 14(1):14429. DOI: http://doi.org/10.1038/s41598-024-65276-x.

Aranha, R. V., Cordeiro, L. N., Sales, L. M., and Nunes, F. L. S. (2021). Engagement and discrete emotions in game scenario: Is there a relation among them? In Ardito, C., Lanzilotti, R., Malizia, A., Petrie, H., Piccinno, A., Desolda, G., and Inkpen, K., editors, Human-Computer Interaction – INTERACT 2021, pages 22–42, Cham. Springer International Publishing. DOI: http://doi.org/10.1007/978-3-030-85613-7_3.

Assunção, W. G. d. and Neris, V. P. d. A. (2019). mmotion: a mobile application for music recommendation that considers the desired emotion of the user. In Proceedings of the 18th Brazilian Symposium on Human Factors in Computing Systems, IHC '19, page 1–11. ACM. DOI: http://doi.org/10.1145/3357155.3358459.

Bakker, B., Schumacher, G., Arceneaux, K., and Gothreau, C. (2022). Conservatives and liberals have similar physiological responses to threats. DOI: http://doi.org/10.17605/OSF.IO/D5G72.

Bakker, B. N., Schumacher, G., Gothreau, C., and Arceneaux, K. (2020). Conservatives and liberals have similar physiological responses to threats. Nature Human Behaviour, 4(6):613–621. DOI: http://doi.org/10.1038/s41562-020-0823-z.

Barbosa, S. D. J., Silva, B. S. d., Silveira, M. S., Gasparini, I., Darin, T., and Barbosa, G. D. J. (2021). Interação Humano-Computador e Experiência do Usuário. Autopublicação.

Boğa, M., Koyuncu, M., Kaça, G., and Bayazıt, T. O. (2022). Comparison of emotion elicitation methods: 3 methods, 3 emotions, 3 measures. Current Psychology, 42(22):18670–18685. DOI: http://doi.org/10.1007/s12144-022-02984-5.

Cowen, A. S., Keltner, D., Schroff, F., Jou, B., Adam, H., and Prasad, G. (2020). Sixteen facial expressions occur in similar contexts worldwide. Nature, 589(7841):251–257. DOI: http://doi.org/10.1038/s41586-020-3037-7.

Da Silva, T. H. C. T., Cavalcanti, M. D., De Sá, F. M. F., Marinho, I. N., Cavalcanti, D. D. Q., and Becker, V. (2022). Visualization of brainwaves using eeg to map emotions with eye tracking to identify attention in audiovisual workpieces. In Proceedings of the Brazilian Symposium on Multimedia and the Web, WebMedia '22, page 381–389, New York, NY, USA. Association for Computing Machinery. DOI: http://doi.org/10.1145/3539637.3557055.

de Sá, F., Cavalcanti, D., and Becker, V. (2023). Testes com usuários para análise de emoções em conteúdos audiovisuais utilizando EEG e eye tracking. In Anais Estendidos do XXIX Simpósio Brasileiro de Sistemas Multimídia e Web, pages 63–66, Porto Alegre, RS, Brasil. SBC. DOI: http://doi.org/10.5753/webmedia_estendido.2023.235663.

D'Amelio, T. A., Bruno, N. M., Bugnon, L. A., Zamberlan, F., and Tagliazucchi, E. (2023). Affective computing as a tool for understanding emotion dynamics from physiology: A predictive modeling study of arousal and valence. In 2023 11th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), pages 1–7. DOI: http://doi.org/10.1109/ACIIW59127.2023.10388155.

D'Avila Goldoni, D., M. Reis, H., and Jaques, P. A. (2023). Emoções na aprendizagem: Estimando a duração da confusão e aprimorando intervenções pedagógicas. Revista Brasileira de Informática na Educação, 31:1225–1247. DOI: http://doi.org/10.5753/rbie.2023.3433.

Fan, Y., Lam, J. C. K., and Li, V. O. K. (2021). Demographic effects on facial emotion expression: an interdisciplinary investigation of the facial action units of happiness. Scientific Reports, 11(1). DOI: http://doi.org/10.1038/s41598-021-84632-9.

Fernández-Aguilar, L., Navarro-Bravo, B., Ricarte, J., Ros, L., and Latorre, J. M. (2019). How effective are films in inducing positive and negative emotional states? a meta-analysis. PLOS ONE, 14(11):e0225040. DOI: http://doi.org/10.1371/journal.pone.0225040.

González-Rodríguez, M., Díaz-Fernández, M., and Pacheco Gómez, C. (2020). Facial-expression recognition: An emergent approach to the measurement of tourist satisfaction through emotions. Telematics and Informatics, 51:101404. DOI: http://doi.org/10.1016/j.tele.2020.101404.

Guimarães, P. D. and De Almeida Neris, V. P. (2024). A tool-supported approach to adapt web user interfaces considering the emotional state of the user. In Proceedings of the XXII Brazilian Symposium on Human Factors in Computing Systems, IHC '23, New York, NY, USA. Association for Computing Machinery. DOI: http://doi.org/10.1145/3638067.3638115.

Hu, X., Wang, F., and Zhang, D. (2022). Similar brains blend emotion in similar ways: Neural representations of individual difference in emotion profiles. NeuroImage, 247:118819. DOI: http://doi.org/10.1016/j.neuroimage.2021.118819.

Judge, T. A. and Robbins, S. P. (2017). Organizational behavior. Pearson.

Kutt, K., Drążyk, D., Bobek, S., and Nalepa, G. J. (2020). Personality-based affective adaptation methods for intelligent systems. Sensors, 21(1):163. DOI: http://doi.org/10.3390/s21010163.

Oakes, R. A., Peschel, L., and Barraclough, N. E. (2024). Inter-subject correlation of audience facial expressions predicts audience engagement during theatrical performances. iScience, page 109843. DOI: http://doi.org/10.1016/j.isci.2024.109843.

Pei, G., Li, H., Lu, Y., Wang, Y., Hua, S., and Li, T. (2024). Affective computing: Recent advances, challenges, and future trends. Intelligent Computing, 3. DOI: http://doi.org/10.34133/icomputing.0076.

Reisenzein, R., Hildebrandt, A., and Weber, H. (2020). Personality and Emotion, page 81–100. Cambridge Handbooks in Psychology. Cambridge University Press, 2 edition. DOI: http://doi.org/10.1017/9781108264822.009.

Saganowski, S., Komoszyńska, J., Behnke, M., Perz, B., Kaczmarek, Ł. D., and Kazienko, P. (2021). Emognition Wearable Dataset 2020. DOI: http://doi.org/10.7910/DVN/R9WAF4.

van Erven, R. C. G. S. and Canedo, E. D. (2023). Measurement of user's satisfaction of digital products through emotion recognition. In Proceedings of the XXII Brazilian Symposium on Software Quality, SBQS '23, page 62–71. ACM. DOI: http://doi.org/10.1145/3629479.3629488.

Vora, D., Garg, M., Kohale, O., and Phansalkar, N. (2023). Face recognition for efficient identification of domain experts at professional meetups. In 2023 International Conference on Advanced Computing Technologies and Applications (ICACTA), pages 1–6. DOI: http://doi.org/10.1109/ICACTA58201.2023.10392956.

Zhou, X., Hu, W., Liu, G.-P., and Pang, Z. (2022). Face recognition system based on ncslab for online experimentation in engineering education. In 2022 41st Chinese Control Conference (CCC), pages 4390–4394. DOI: http://doi.org/10.23919/CCC55666.2022.9902741.

Published

2025-10-11

How to Cite

AGUIAR, G. O.; ESTEVES, J. P. D.; CAMPOS, L. M. C. de; NASCIMENTO, T. H.; PEREIRA JÚNIOR, C. X.; ARANHA, R. V. Exploring approaches to compare emotional responses to the same stimuli from different individuals. Journal on Interactive Systems, Porto Alegre, RS, v. 16, n. 1, p. 925–939, 2025. DOI: 10.5753/jis.2025.5421. Available at: https://journals-sol.sbc.org.br/index.php/jis/article/view/5421. Accessed: 20 Dec. 2025.

Issue

Section

Regular Paper