ARFood: an augmented-reality food diary app for asynchronous collaborative interaction
DOI: https://doi.org/10.5753/jis.2024.4346

Keywords: Augmented Reality, Collaboration, Food Recognition, Nutrition

Abstract
This work presents the development and evaluation of ARFood, a mobile app for cooperation between nutritionists and patients through records in a food diary, combining Augmented Reality resources, Computer Vision and Artificial Intelligence for food recognition, and asynchronous collaboration. We built the app with Unity, integrating several libraries: LogMeal for food recognition, EDAMAM for nutritional analysis, Vuforia for augmented reality interaction, and Firebase for cloud data storage. We then conducted a pilot study with six nutritionist-patient pairs to assess technology acceptance. Mean scores showed a medium level of acceptance among nutritionists and a satisfactory level among patients (3.54 vs. 4.38 for perceived ease of use and 3.33 vs. 3.75 for perceived usefulness, on a Likert scale). Even so, both nutritionists and patients (83.3%) reported that they would recommend the application as a tool for recording and monitoring a food diary. Augmented reality and computer vision proved to be valuable resources for a Nutrition app, showing potential for adoption provided that more digital content is added and the food recognition model is extended to cover regional cuisine.
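The abstract describes a pipeline in which a meal photo is classified by a food recognition service (LogMeal), the recognized foods are mapped to nutritional data (EDAMAM), and the result is stored as a diary entry in the cloud (Firebase). As a rough illustration of that flow only — not the authors' code: the function names, the mocked recognition result, and the tiny nutrient table below are invented for this sketch, and real deployments would call the respective web APIs instead — the diary-entry step might look like:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical nutrient table standing in for an EDAMAM nutrition lookup;
# real values would come from the nutrition API, keyed by recognized food.
NUTRIENTS = {"rice": {"kcal": 130}, "beans": {"kcal": 127}}

@dataclass
class DiaryEntry:
    foods: list            # food labels returned by the recognizer
    kcal: float            # total energy resolved from the nutrient table
    created_at: str        # UTC timestamp, useful for asynchronous review

def recognize(image_bytes: bytes) -> list:
    """Stand-in for a LogMeal-style image recognition call (mocked here)."""
    return ["rice", "beans"]

def build_entry(image_bytes: bytes) -> DiaryEntry:
    """Recognize foods in a photo and assemble a diary entry for storage."""
    foods = recognize(image_bytes)
    kcal = sum(NUTRIENTS[f]["kcal"] for f in foods if f in NUTRIENTS)
    return DiaryEntry(foods=foods, kcal=kcal,
                      created_at=datetime.now(timezone.utc).isoformat())

entry = build_entry(b"...")  # placeholder image bytes
```

The timestamped entry is what makes the collaboration asynchronous: the nutritionist can later fetch and annotate entries from cloud storage without the patient being online.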
References
Alvarenga, M., Figueiredo, M., Timerman, F., and Antonaccio, C. (2018). Nutrição Comportamental, volume 2. Editora Manole. ISBN: 978-8520456156.
Andersen, D., Baird, S., Bates, T., Chapel, D. L., Cline, A. D., Ganesh, S. N., Garner, M., Grant, B. L., Hamilton, K. K., Jablonski, K., et al. (2018). Academy of nutrition and dietetics: Revised 2017 standards of practice in nutrition care and standards of professional performance for registered dietitian nutritionists. Journal of the Academy of Nutrition and Dietetics, 118(1):132–140. DOI: https://doi.org/10.1016/j.jand.2017.10.003.
Billinghurst, M., Clark, A., and Lee, G. (2015). A survey of augmented reality. Foundations and Trends® in Human–Computer Interaction, 8(2-3):73–272. DOI: https://doi.org/10.1561/1100000049.
Bite AI (2021). Bite AI - food recognition api. Available at: [link] (Accessed: 22 Jul 2024).
Brazilian Institute for Geography and Statistics (2020). Um em cada quatro adultos do país estava obeso em 2019; atenção primária foi bem avaliada. Available at: [link] (Accessed: 22 Jul 2024).
Burova, A., Mäkelä, J., Heinonen, H., Palma, P. B., Hakulinen, J., Opas, V., Siltanen, S., Raisamo, R., and Turunen, M. (2022). Asynchronous industrial collaboration: How virtual reality and virtual tools aid the process of maintenance method development and documentation creation. Computers in Industry, 140:103663. DOI: https://doi.org/10.1016/j.compind.2022.103663.
Calorie Mama (2021). Calorie mama food AI - food image recognition and calorie counter using deep learning. Available at: [link] (Accessed: 22 Jul 2024).
Cervato-Mancuso, A. M., Tonacio, L. V., Silva, E. R. d., and Vieira, V. L. (2012). A atuação do nutricionista na atenção básica à saúde em um grande centro urbano. Ciência & Saúde Coletiva, 17:3289–3300. DOI: https://doi.org/10.1590/S1413-81232012001200014.
Chuah, S. H.-W. (2018). Why and who will adopt extended reality technology? Literature review, synthesis, and future research agenda. SSRN working paper (December 13, 2018). DOI: https://doi.org/10.2139/ssrn.3300469.
Clarifai (2021). AI-Driven Food Model - FoodAI for recognition - Clarifai. Available at: [link] (Accessed: 22 Jul 2024).
Cordeil, M., Dwyer, T., Klein, K., Laha, B., Marriott, K., and Thomas, B. H. (2017). Immersive collaborative analysis of network connectivity: Cave-style or head-mounted display? IEEE Transactions on Visualization and Computer Graphics, 23(1):441–450. DOI: https://doi.org/10.1109/tvcg.2016.2599107.
Davis, F. D., Bagozzi, R. P., and Warshaw, P. R. (1989). User acceptance of computer technology: a comparison of two theoretical models. Management science, 35(8):982–1003. DOI: https://doi.org/10.1287/mnsc.35.8.982.
EDAMAM (2021). Edamam - food database api, nutrition api and recipe api. Available at: [link] (Accessed: 22 Jul 2024).
Elbamby, M. S., Perfecto, C., Bennis, M., and Doppler, K. (2018). Toward low-latency and ultra-reliable virtual reality. IEEE Network, 32(2):78–84. DOI: https://doi.org/10.1109/MNET.2018.1700268.
Ens, B., Lanir, J., Tang, A., Bateman, S., Lee, G., Piumsomboon, T., and Billinghurst, M. (2019). Revisiting collaboration through mixed reality: The evolution of groupware. International Journal of Human-Computer Studies, 131:81–98. DOI: https://doi.org/10.1016/j.ijhcs.2019.05.011.
Fadhil, A. (2019). Comparison of self-monitoring feedback data from electronic food and nutrition tracking tools. CoRR, abs/1904.08376. Available at: http://arxiv.org/abs/1904.08376.
Fan, J., Beuscher, L., Newhouse, P., Mion, L. C., and Sarkar, N. (2018). A collaborative virtual game to support activity and social engagement for older adults. In Universal Access in Human-Computer Interaction. Methods, Technologies, and Users, pages 192–204. Springer International Publishing. DOI: https://doi.org/10.1007/978-3-319-92049-8_14.
Firebase (2021). Firebase. Available at: [link] (Accessed: 22 Jul 2024).
Flutter (2021). Flutter - build apps for any screen. Available at: [link] (Accessed: 22 Jul 2024).
Foodai (2021). Foodai - state-of-the-art food image recognition technologies. Available at: [link] (Accessed: 22 Jul 2024).
Freitas, C. N. C., Cordeiro, F. R., and Macario, V. (2020). My food: A food segmentation and classification system to aid nutritional monitoring. In 2020 33rd SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI), pages 234–239. DOI: https://doi.org/10.1109/SIBGRAPI51738.2020.00039.
Fuhrmann, A., Löffelmann, H., Schmalstieg, D., and Gervautz, M. (1998). Collaborative visualization in augmented reality. IEEE Computer Graphics and Applications, 18(4):54–59. DOI: https://doi.org/10.1109/38.689665.
García, A. S., Fernando, T., Roberts, D. J., Bar, C., Cencetti, M., Engelke, W., and Gerndt, A. (2019). Collaborative virtual reality platform for visualizing space data and mission planning. Multimedia Tools and Applications, 78(23):33191–33220. DOI: https://doi.org/10.1007/s11042-019-7736-8.
Grandi, J. G. (2018). Collaborative 3D Interactions and Their Application on Virtual, Augmented and Mixed Reality Interfaces. PhD thesis, PPGC, Porto Alegre.
Grzegorczyk, T., Sliwinski, R., and Kaczmarek, J. (2019). Attractiveness of augmented reality to consumers. Technology Analysis & Strategic Management, 31(11):1257–1269. DOI: https://doi.org/10.1080/09537325.2019.1603368.
Guo, A., Canberk, I., Murphy, H., Monroy-Hernández, A., and Vaish, R. (2019). Blocks: Collaborative and persistent augmented reality experiences. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 3(3):1–24. DOI: https://doi.org/10.1145/3351241.
Holmberg, C., Klingberg, S., and Brembeck, P. (2021). The food diary as a pedagogical strategy in undergraduate nursing nutrition education: a student evaluation. Nurse Education Today, 98:104737. DOI: https://doi.org/10.1016/j.nedt.2020.104737.
Ionic (2021). Cross-platform mobile app development: Ionic framework. Available at: [link] (Accessed: 22 Jul 2024).
Irlitti, A., Smith, R. T., Von Itzstein, S., Billinghurst, M., and Thomas, B. H. (2016). Challenges for asynchronous collaboration in augmented reality. In 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), pages 31–35. IEEE. DOI: https://doi.org/10.1109/ISMAR-Adjunct.2016.0032.
Leigh, J. and Johnson, A. (1996). Supporting transcontinental collaborative work in persistent virtual environments. IEEE Computer Graphics and Applications, 16(4):47–51. DOI: https://doi.org/10.1109/38.511853.
LogMeal (2021). Logmeal food ai - image api and restaurant check-out solutions. food detection and food tracking based on the most advanced deep learning. Available at: [link] (Accessed: 22 Jul 2024).
Marques, B., Silva, S., Rocha, A., Dias, P., and Santos, B. S. (2021). Remote asynchronous collaboration in maintenance scenarios using augmented reality and annotations. In 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), pages 567–568. IEEE. DOI: https://doi.org/10.1109/VRW52623.2021.00166.
Rantzau, D. and Lang, U. (1998). A scalable virtual environment for large scale scientific data analysis. Future Generation Computer Systems, 14(3–4):215–222. DOI: https://doi.org/10.1016/s0167-739x(98)00025-9.
React Native (2021). React native - learn once, write anywhere. Available at: [link] (Accessed: 22 Jul 2024).
Reisner-Kollmann, I. and Aschauer, A. (2020). Design and implementation of asynchronous remote support. In XChange Reality (XCR), pages 9–11. CEUR-WS. DOI: [link].
Tait, M. and Billinghurst, M. (2015). The effect of view independence in a collaborative AR system. Computer Supported Cooperative Work (CSCW), 24(6):563–589. DOI: https://doi.org/10.1007/s10606-015-9231-8.
The Brazilian Federal Council of Nutritionists (2020). Resolução nº 666, de 30 de setembro de 2020. Available at: [link] (Accessed: 22 Jul 2024).
Unity (2021). Unity real-time development platform | 3D, 2D, VR and AR engine. Available at: [link] (Accessed: 22 Jul 2024).
Unreal (2021). The most powerful real-time 3d creation tool - Unreal engine. Available at: [link] (Accessed: 22 Jul 2024).
Vlahovic, S., Skorin-Kapov, L., Suznjevic, M., and Pavlin-Bernardic, N. (2024). Not just cybersickness: Short-term effects of popular VR game mechanics on physical discomfort and reaction time. Virtual Reality, 28(2):1–30. DOI: https://doi.org/10.1007/s10055-024-01007-x.
Vuforia AR (2021). Vuforia developer portal. Available at: [link] (Accessed: 22 Jul 2024).
Wohlin, C., Runeson, P., Höst, M., Ohlsson, M. C., Regnell, B., and Wesslén, A. (2012). Experimentation in software engineering. Springer Science & Business Media. DOI: https://doi.org/10.1007/978-3-642-29044-2.
Xu, J., Yang, L., Guo, M., Gao, F., and Yin, Y. (2023). Immersive museum: Design and develop an interactive virtual museum experience. In International Conference on Human-Computer Interaction, pages 152–161. Springer. DOI: https://doi.org/10.1007/978-3-031-35946-0_13.
Zhou, T., Zhu, Q., and Du, J. (2020). Intuitive robot teleoperation for civil engineering operations with virtual reality and deep learning scene reconstruction. Advanced Engineering Informatics, 46:101170. DOI: https://doi.org/10.1016/j.aei.2020.101170.
License
Copyright (c) 2024 João Pedro Assunção Campos, Guilherme Afonso Madalozzo, Ana Luisa Sant'Anna Alves, Rafael Rieder
This work is licensed under a Creative Commons Attribution 4.0 International License.
JIS is free of charge for authors and readers, and all papers published by JIS follow the Creative Commons Attribution 4.0 International (CC BY 4.0) license.