Quantifying Computational Thinking Skills: An Exploratory Study on Bebras Tasks

DOI: https://doi.org/10.5753/jbcs.2025.3893

Keywords: Computational Thinking Assessment, Item Response Theory, Bebras Challenge

Abstract
Computational Thinking (CT) is a cognitive problem-solving approach commonly employed in Computer Science. In recent years, various strategies have emerged to promote CT awareness and understanding. Despite these initiatives, little quantitative analysis has assessed CT as a cognitive skill among undergraduate students, particularly with a focus on the items designed to measure it. In this study, our objective is to investigate the psychometric properties of CT questions answered by novice Computer Science undergraduates. To this end, we selected a set of questions from the Bebras Challenge, an international competition designed to explore CT skills without requiring programming expertise, and applied Item Response Theory (IRT) to examine the difficulty and discrimination of the selected questions. Difficulty characterizes how an examinee responds to an item, while discrimination measures how effectively an item differentiates between individuals with higher and lower levels of knowledge. Our findings reveal several key insights: (i) theoretical predictions of question difficulty achieved an accuracy of 53% to 58% when compared with empirical data; (ii) the Bebras Challenge questions predominantly exhibited two levels of difficulty, ranging from easy to medium; and (iii) the questions displayed a spectrum of discrimination levels (low, moderate, and high), a crucial property for crafting effective assessment instruments. We also distill lessons from this exploratory study on designing questions that can reliably measure CT skills, clarifying which features influence the reliable design of CT assessment items.
These insights serve as a resource for future research aimed at improving the assessment of CT abilities.
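The difficulty and discrimination parameters analyzed in the study come from standard IRT models. As an illustrative sketch only (the function and parameter values below are not taken from the paper), the two-parameter logistic (2PL) model expresses the probability of a correct response as a function of examinee ability, item difficulty, and item discrimination:

```python
import math

def item_prob(theta, a, b):
    """Two-parameter logistic (2PL) IRT model.

    theta: examinee ability
    a: discrimination (steepness of the item's response curve)
    b: difficulty (ability level at which P(correct) = 0.5)
    Returns the probability of a correct response.
    """
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An average-ability examinee (theta = 0) facing two hypothetical items:
# an easy, highly discriminating item vs. a harder, weakly discriminating one.
print(round(item_prob(0.0, a=2.0, b=-1.0), 2))  # -> 0.88
print(round(item_prob(0.0, a=0.5, b=1.0), 2))   # -> 0.38
```

A higher `a` makes the curve steeper around `b`, so the item separates examinees just below and just above that ability level more sharply, which is what "high discrimination" means in the findings above.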
License
Copyright (c) 2025 Ana Liz Souto Oliveira, Wilkerson L. Andrade, Dalton Serey, Monilly Ramos Araujo Melo

This work is licensed under a Creative Commons Attribution 4.0 International License.

