Learner Experience Evaluation: a Feasibility Study and a Benchmark

Authors

G. C. dos Santos, D. E. dos S. Silva, N. M. C. Valentim

DOI:

https://doi.org/10.5753/jbcs.2025.4306

Keywords:

Learner eXperience, Evaluation, Model, LX, Qualitative analysis, Quantitative analysis, Student eXperience

Abstract

Learner eXperience (LX) is a concept derived from User eXperience (UX) and can be defined as the perceptions, responses, and performance of learners interacting with Digital Information and Communication Technologies (DICTs). It is important to evaluate LX in order to provide experiences that support and facilitate learning and knowledge mastery. Thus, we developed the Learner Experience Evaluation Model (LEEM) to assess and improve the learner's experience with DICTs during learning. LEEM is a generic evaluation model: it can be applied at any level of education, independently of the discipline, and with any educational technology. This article presents a feasibility study that evaluated the LEEM steps and sentences from the perspective of potential users. Nineteen teachers from different levels of education participated in the study. The study results were analyzed and gave rise to a new version of LEEM. The results showed positive aspects of LEEM, which was considered a practical, objective, easy-to-use, and useful model, and revealed opportunities to improve some of its items and sentences. The teachers also suggested adding descriptions at the endpoints of the scales to make the items easier to answer. This study contributes to building a body of knowledge about LEEM and to analyzing its use, feasibility, and evolution. Moreover, given the lack of synthesized content and characteristics about technologies that evaluate LX, we carried out a benchmark of the LX evaluation technologies identified in a Systematic Mapping Study (SMS) to compare them with LEEM, and we present important characteristics to be analyzed in these types of technology, such as the elements and types of LX assessment and whether there is tool support for this assessment.

References

Anand, G. and Kodali, R. (2008). Benchmarking the benchmarking models. Benchmarking: An International Journal, 15(3):257-291. DOI: 10.1108/14635770810876593.

Chapman, J. R., Seeley, E. L., Wright, N. S., Glenn, L. M., and Adams, L. L. (2016). An empirical evaluation of a broad ranging e-text adoption with recommendations for improving deployment success for students. Available online [link].

Corbin, J. et al. (1990). Basics of qualitative research: grounded theory procedures and techniques. Available online [link].

Correa, C. M., de Freitas, G. V. M., dos Santos Eberhardt, A. L., and Silveira, M. S. (2021). From now on: Experiences from user-based research in remote settings. In Proceedings of the XX Brazilian Symposium on Human Factors in Computing Systems, IHC '21, New York, NY, USA. Association for Computing Machinery. DOI: 10.1145/3472301.3484334.

da Silva, E. J. and Ziviani, H. E. (2018). Desenho e música no ensino de ihc: relato de experiência de uma aula sobre conceitos básicos da engenharia semiótica. In Anais Estendidos do XVII Simpósio Brasileiro sobre Fatores Humanos em Sistemas Computacionais, Porto Alegre, RS, Brasil. SBC. DOI: 10.5753/ihc.2018.4210.

Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3):319-340. DOI: 10.2307/249008.

Deodato, J. (2015). Evaluating web-scale discovery services: a step-by-step guide. Information Technology & Libraries, 34(2):19-75. DOI: 10.7282/T3K64M3R.

dos Santos, G. C., Silva, D. E. dos S., and Valentim, N. M. C. (2024). Feasibility study of a model that evaluates the learner experience: A quantitative and qualitative analysis. In Proceedings of the XXII Brazilian Symposium on Human Factors in Computing Systems, IHC '23, New York, NY, USA. Association for Computing Machinery. DOI: 10.1145/3638067.3638119.

dos Santos, G. C., Silva, D. E., and Valentim, N. M. (2022). Um mapeamento sistemático da literatura sobre iniciativas que avaliam a experiência do aprendiz. In Anais do XXXIII Simpósio Brasileiro de Informática na Educação, pages 621-633, Porto Alegre, RS, Brasil. SBC. DOI: 10.5753/sbie.2022.224673.

dos Santos, G. C., Silva, D. E., and Valentim, N. M. (2023). Proposal and preliminary evaluation of a learner experience evaluation model in information systems. In Proceedings of the XIX Brazilian Symposium on Information Systems, SBSI '23, page 308–316, New York, NY, USA. Association for Computing Machinery. DOI: 10.1145/3592813.3592919.

Dune, T., Bidewell, J., Firdaus, R., and Kirwan, M. (2016). Communication idol: Using popular culture to catalyse active learning by engaging students in the development of entertaining teaching and learning resources. Journal of University Teaching & Learning Practice, 13(5):15. DOI: 10.53761/1.13.5.3.

El Mawas, N., Tal, I., Moldovan, A.-N., Bogusevschi, D., Andrews, J., Muntean, G.-M., and Muntean, C. H. (2020). Investigating the impact of an adventure-based 3d solar system game on primary school learning process. Available online [link].

Fotaris, P., Mastoras, T., Leinfellner, R., and Rosunally, Y. (2016). Climbing up the leaderboard: An empirical study of applying gamification techniques to a computer programming class. Available online [link].

Huang, R., Spector, J. M., and Yang, J. (2019). Educational Technology: A Primer for the 21st Century. Springer. DOI: 10.1007/978-981-13-6643-7.

Imamura, R. E. M. and Baranauskas, M. C. C. (2019). A framework for socio-enactive educational systems: Linking learning, design, and technology. In IHC '19: XVIII Brazilian Symposium on Human Factors in Computing Systems, IHC '19, New York, NY, USA. Association for Computing Machinery. DOI: 10.1145/3357155.3358443.

ISO 9241-210 (2019). Ergonomics of human-system interaction — part 210: Human-centred design for interactive systems. Available online [link]. Accessed on 23/02/2022.

Jantzen, C., Vetner, M., and Bouchet, J. (2011). Oplevelsesdesign. Available online [link].

Kawano, A., Motoyama, Y., and Aoyama, M. (2019). A lx (learner experience)-based evaluation method of the education and training programs for professional software engineers. In Proceedings of the 2019 7th International Conference on Information and Education Technology, ICIET 2019, page 151–159, New York, NY, USA. Association for Computing Machinery. DOI: 10.1145/3323771.3323789.

Lang, P. (1980). Behavioral treatment and bio-behavioral assessment: Computer applications. Available online [link].

Lima, D. T., Zacarias, R. O., de Souza, K. E. S., dos Santos, R. P., and da Rocha Seruffo, M. C. (2021). Analytical model for classifying areas of interest in interactive systems. In Proceedings of the XX Brazilian Symposium on Human Factors in Computing Systems, IHC '21, New York, NY, USA. Association for Computing Machinery. DOI: 10.1145/3472301.3484357.

Lykke, M., Coto, M., Jantzen, C., Mora, S., and Vandel, N. (2015). Motivating students through positive learning experiences: A comparison of three learning designs for computer programming courses. Journal of Problem Based Learning in Higher Education, 3(2):80-108. DOI: 10.5278/ojs.jpblhe.v0i0.1130.

Martinelli, S. R. and Zaina, L. A. M. (2021). Learning hci from a virtual flipped classroom: Improving the students' experience in times of covid-19. In Proceedings of the XX Brazilian Symposium on Human Factors in Computing Systems, IHC '21, New York, NY, USA. Association for Computing Machinery. DOI: 10.1145/3472301.3484326.

Muriana, L. A. M. and Baranauskas, M. C. C. (2021). Affecting user's self-esteem: Analysis under the self-determination theory perspective and design recommendations. In Proceedings of the XX Brazilian Symposium on Human Factors in Computing Systems, IHC '21, New York, NY, USA. Association for Computing Machinery. DOI: 10.1145/3472301.3484331.

Nygren, E., Blignaut, A. S., Leendertz, V., and Sutinen, E. (2019). Quantitizing affective data as project evaluation on the use of a mathematics mobile game and intelligent tutoring system. Available online [link].

Reyna, J. and Meier, P. (2018). Using the learner-generated digital media (lgdm) framework in tertiary science education: a pilot study. Education Sciences, 8(3):106. DOI: 10.3390/educsci8030106.

Rosa, J. C. S., Rêgo, B. B. d., Garrido, F. A., Valente, P. D., Nunes, N. J., and Matos, E. (2020). Interaction design and requirements elicitation integrated through spide: A feasibility study. DOI: 10.1145/3424953.3426498.

Ruiz, J. and Snoeck, M. (2018). Adapting kirkpatrick's evaluation model to technology enhanced learning. In MODELS '18: ACM/IEEE 21st International Conference on Model Driven Engineering Languages and Systems, MODELS '18, page 135–142, New York, NY, USA. Association for Computing Machinery. DOI: 10.1145/3270112.3270114.

Schmidt, M. and Huang, R. (2022). Defining learning experience design: Voices from the field of learning design & technology. TechTrends, 66(2):141-158. DOI: 10.1007/s11528-021-00656-y.

Shi, L. (2014). Defining and Evaluating Learner Experience for Social Adaptive E-Learning. In 2014 Imperial College Computing Student Workshop, volume 43 of OpenAccess Series in Informatics (OASIcs), pages 74-82, Dagstuhl, Germany. Schloss Dagstuhl-Leibniz-Zentrum für Informatik. DOI: 10.4230/OASIcs.ICCSW.2014.74.

Shull, F., Mendonça, M. G., Basili, V., Carver, J., Maldonado, J. C., Fabbri, S., Travassos, G. H., and Ferreira, M. C. (2004). Knowledge-sharing issues in experimental software engineering. Empirical Software Engineering, 9(1):111-137. DOI: 10.1023/B:EMSE.0000013516.80487.33.

Soloway, E., Guzdial, M., and Hay, K. E. (1994). Learner-centered design: The challenge for hci in the 21st century. Interactions, 1(2):36–48. DOI: 10.1145/174809.174813.

Stanley, D. and Zhang, J. (2018). Do student-produced videos enhance engagement and learning in the online environment? Online Learning, 22(2). DOI: 10.24059/olj.v22i2.1367.

Tabares, M. S., Vallejo, P., Montoya, A., Sanchez, J., and Correa, D. (2021). Seca: A feedback rules model in a ubiquitous microlearning context. In DATA'21: International Conference on Data Science, E-learning and Information Systems 2021, DATA'21, page 136–142, New York, NY, USA. Association for Computing Machinery. DOI: 10.1145/3460620.3460745.

Venkatesh, V. and Bala, H. (2008). Technology acceptance model 3 and a research agenda on interventions. Decision Sciences, 39(2):273-315. DOI: 10.1111/j.1540-5915.2008.00192.x.

Villas Boas, B. M. d. F. (2006). Avaliação formativa e formação de professores: ainda um desafio. Available online [link].

Wohlin, C., Runeson, P., Höst, M., Ohlsson, M. C., Regnell, B., and Wesslén, A. (2012). Experimentation in software engineering: an introduction. Springer Berlin, Heidelberg. DOI: 10.1007/978-3-642-29044-2.

Yeh, S.-W. and Chen, C.-T. (2019). Efl learners' peer negotiations and attitudes in mobile-assisted collaborative writing. Language Education & Assessment, 2(1):41-56. DOI: 10.29140/lea.v2n1.100.

Published

2025-02-20

How to Cite

dos Santos, G. C., Silva, D. E. dos S., & Valentim, N. M. C. (2025). Learner Experience Evaluation: a Feasibility Study and a Benchmark. Journal of the Brazilian Computer Society, 31(1), 132–153. https://doi.org/10.5753/jbcs.2025.4306

Issue

Vol. 31 No. 1 (2025)

Section

Articles