Systematic mapping of technologies for supporting choreographic composition

Authors

I. F. Dornelas, R. D. Seabra, A. D. de Souza

DOI:

https://doi.org/10.5753/jis.2022.2635

Keywords:

Dance Creation, Choreography Composition, Choreography Design, Technology, Systematic Mapping

Abstract

Technology has increasingly reached other areas of the sciences and humanities, including art and dance. Over the years, initiatives to use technological applications in artistic performances have been observed, and this research was developed in that context, addressing the challenge of using technology to support the creations imagined by artists. The systematic mapping of the literature carried out here is part of a broad search for studies that portray the interdisciplinarity of these two universes, aiming to find technologies that support the choreographic composition process, with a focus on tools that work together with the choreographer’s activities. The methodology consisted of applying search terms to research repositories, which initially returned 635 publications; these were then filtered by inclusion and exclusion criteria before undergoing further analysis. Eighteen tools were identified and explored, whose main applicability was the simulation of movements through graphic animation. Based on how these applications operate, the challenges of the relationship between technology and dance creation were discussed. This study only incorporates technologies that act as support tools by sharing the compositional effort, which creates the opportunity for future investigations into other ways of using technology in dance creation. The main contribution of this paper is identifying and classifying the main strategies for integrating technology and dance composition, as well as summarizing the data and discussing its implications; the most relevant finding is the lack of involvement of artists (end users) in the early stages of the development process.
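The abstract summarizes the mapping workflow: repository searches with defined terms, an initial pool of 635 publications, screening by inclusion and exclusion criteria, and analysis of the 18 remaining tools. The Python sketch below is a purely illustrative, hedged rendering of how such a screening step could be expressed; the record fields, keywords, and cutoffs are assumptions for illustration, not the authors' actual protocol or criteria.

from dataclasses import dataclass

@dataclass
class Publication:
    """Minimal stand-in for a record returned by a repository search (hypothetical fields)."""
    title: str
    abstract: str
    year: int
    venue: str

def passes_screening(pub: Publication) -> bool:
    """Apply example inclusion/exclusion criteria to one record (illustrative only)."""
    text = f"{pub.title} {pub.abstract}".lower()
    # Inclusion: the record must mention both dance/choreography and a supporting technology.
    mentions_dance = any(k in text for k in ("dance", "choreograph"))
    mentions_tech = any(k in text for k in ("tool", "system", "software", "technology"))
    # Exclusion: drop records with no venue or outside an assumed time window.
    return mentions_dance and mentions_tech and pub.year >= 1990 and bool(pub.venue)

def screen(pubs: list[Publication]) -> list[Publication]:
    """Reduce the initial search results to candidates for full-text analysis."""
    return [p for p in pubs if passes_screening(p)]

if __name__ == "__main__":
    sample = [
        Publication("Scuddle: movement catalysts for computer-aided choreography",
                    "A tool generating movement catalysts for dance creation.", 2011, "ICCC"),
        Publication("A study of fish locomotion", "Unrelated biology paper.", 2005, "J. Biol."),
    ]
    print([p.title for p in screen(sample)])  # only the dance-related record remains

In an actual systematic mapping, the screening decisions are made by the researchers themselves; automated keyword filtering of this kind would at most pre-sort candidates for manual review.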


References

Ahn, H., Kim, J., Kim, K., & Oh, S. (2020). Generative autoregressive networks for 3d dancing move synthesis from music. IEEE Robot. Autom. Lett., 5, 2: 3501–3508.

Alaoui, S. F., Carlson, K., & Schiphorst, T. (2014). Choreography as mediated through compositional tools for movement: constructing a historical perspective. In ACM Int. Conf. Proceeding Ser., pages 1–6.

Alemi, O. (2017). GrooveNet: real-time music-driven dance movement generation using artificial neural networks. SIGKDD-W, July, page 6.

Aristidou, A., Stavrakis, E., Papaefthimiou, M., Papagiannakis, G., & Chrysanthou, Y. (2018). Style-based motion analysis for dance composition. Vis. Comput., 34, 12: 1725–1737.

Bermudez, B., & Ziegler, C. (2014). Pre-choreographic movement kit. In ACM Int. Conf. Proceeding Ser., pages 7–12.

Bradford, J. H., & Côté-Laurence, P. (1995). An application of artificial intelligence to the choreography of dance. Comput. Hum., 29, 4: 233–240.

Brand, M., & Hertzmann, A. (2000). Style machines. In Proc. ACM SIGGRAPH Conf. Comput. Graph., pages 183–192.

Cabral, D., Fernandes, C., Carvalho, U., Correia, N., Silva, J., & Valente, J. (2011). Multimodal video annotation for contemporary dance creation. In Conf. Hum. Factors Comput. Syst., pages 2293–2298.

Cambridge Dictionary. SUPPORT | Meaning and definition in the English Dictionary. [link] (accessed Jun. 28, 2021).

Calvert, T., Bruderlin, A., Dill, J., Schiphorst, T., & Welman, C. (1993). Desktop animation of multiple human figures. IEEE Comput. Graph. Appl., 13, 3: 18–26.

Calvert, T. W., Bruderlin, A., Mah, S., Schiphorst, T., & Welman, C. (1993). The evolution of an interface for choreographers. In Conf. Hum. Factors Comput. Syst. - Proc., 5, pages 115–122.

Calvert, T. W., Welman, C., Gaudet, S., Schiphorst, T., & Lee, C. (1991). Composition of multiple figure sequences for dance and animation. Vis. Comput., 7, 2–3: 114–121.

Calvert, T., Wilke, L., Ryman, R., & Fox, I. (2005). Applications of computers to dance. IEEE Comput. Graph. Appl., 25, 2: 6–12.

Camurri, A., Hashimoto, S., Ricchetti, M., Ricci, A., Suzuki, K., Trocca, R., & Volpe, G. (2000). Eyesweb: toward gesture and affect recognition in interactive dance and music systems. Comput. Music J., 24, 1: 57–69.

Carlson, G., Lundén, P., Lundin, M., Nyman, G., Rajka, P., & Ungvary, T. (1992). How to build a multimedial communication/creation system for music and human motion. Multimedia, 153–169.

Carlson, K., Schiphorst, T., & Pasquier, P. (2011). Scuddle: generating movement catalysts for computer-aided choreography. In Proc. 2nd Int. Conf. Comput. Creat. ICCC 2011, pages 123–128.

Carlson, K., Tsang, H. H., Phillips, J., Schiphorst, T., & Calvert, T. (2015). Sketching movement: designing creativity tools for in-situ, whole-body authorship. In ACM Int. Conf. Proceeding Ser., pages 68–75.

Carroll, E. A., Lottridge, D., Latulipe, C., Singh, V., & Word, M. (2012). Bodies in critique: a technological intervention in the dance production process. In Proc. ACM Conf. Comput. Support. Coop. Work. CSCW, pages 705–714.

Corness, G., & Schiphorst, T. (2013). Performing with a system’s intention: embodied cues in performer-system interaction. In Proceedings of the 9th ACM Conference on Creativity & Cognition, pages 156–164.

Crnkovic-Friis, L., & Crnkovic-Friis, L. (2015). Generative choreography using deep learning long short-term memory. In 7th Int. Conf. Comput. Creat. ICCC 2016, pages 1–6.

Dania, A., Koutsouba, M., & Tyrovola, V. (2015). The ability of using symbols and its contribution to dance learning: the Laban Notation System. Choros International Dance Journal, 37–49.

Dong, R., Cai, D., & Asai, N. (2017). Nonlinear dance motion analysis and motion editing using Hilbert-Huang transform. In ACM Int. Conf. Proceeding Ser., Part F1286.

Du, X., Ma, H., & Wang, H. (2017). A knowledge-based intelligent system for distributed and collaborative choreography. In International Conference on Intelligent Robotics and Applications. Springer, Cham, pages 617–627.

Eiben, A. E., & Smith, J. E. (2015). Creating choreography with interactive evolutionary algorithms. In Introduction to Evolutionary Computing. Natural Computing Series. Springer, Berlin, Heidelberg, pages 215–222.

Felice, M. C., Alaoui, S. F., & MacKay, W. E. (2016). How do choreographers craft dance? Designing for a choreographer-technology partnership. In ACM Int. Conf. Proceeding Ser., 05-06-July.

Fernandes, C. (2013). The TKB project: creative technologies for performance composition, analysis and documentation. Lect. Notes Comput. Sci. (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics), 7990 LNCS, 2010: 205–217.

Hsieh, C. M., & Luciani, A. (2005). Generating dance verbs and assisting computer choreography. In Proc. 13th ACM Int. Conf. Multimedia, MM 2005, pages 774–782.

Jacob, M., Zook, A., & Magerko, B. (2013). Viewpoints AI: procedurally representing and reasoning about gestures. In DiGRA 2013 - Proc. 2013 DiGRA Int. Conf. DeFragging GameStudies.

Jadhav, S., & Pawar, J. (2013). Art to SMart: automation for BharataNatyam choreography. In Proceedings of the 19th International Conference on Management of Data, pages 131–134.

Kar, R., Konar, A., & Chakraborty, A. (2015). Dance composition using microsoft Kinect. Lect. Notes Comput. Sci. (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics), 9030: 20–34.

Kohno, Y., Soga, A., & Shiba, M. (2010). A system for motion synthesis of human body parts using a touch panel. In Proc. ACM SIGGRAPH Conf. Virtual-Reality Contin. Its Appl. to Ind., pages 145–146.

Konar, A., & Saha, S. (2018). Differential evolution based dance composition. Stud. Comput. Intell., 724: 225–241.

Krylov, D. I., & Samsonovich, A. V. (2019). Designing an emotionally-intelligent assistant of a virtual dance creator. Biologically Inspired Cognitive Architectures Meeting, Springer, Cham, pages 197-202.

Lapointe, F. J. (2005). Choreogenetics: the generation of choreographic variants through genetic mutations and selection. In Proc. 2005 Work. Genet. Evol. Comput., pages 366–369.

Lee, J., Kim, S., & Lee, K. (2018). Listen to dance: music-driven choreography generation using autoregressive encoder-decoder network. arXiv preprint arXiv:1811.00818.

Lee, H. Y., Yang, X., Liu, M. Y., Wang, T. C., Lu, Y. D., Yang, M. H., & Kautz, J. (2019). Dancing to music. Adv. Neural Inf. Process. Syst., 32, November.

Lobo, L., & Navas, C. (2019). Arte da composição: teatro do movimento. LGE Editora.

Mironcika, S., Pek, J., Franse, J., & Shu, Y. (2016). Whoosh gloves: interactive tool to form a dialog between dancer and choreographer. In TEI 2016 - Proc. 10th Anniv. Conf. Tangible Embed. Embodied Interact., pages 729–732.

Nakazawa, M., & Paezold-Ruehl, A. (2009). DANCING, dance and choreography; an intelligent nondeterministic generator. In Proc. Richard Tapia Celebr. Divers. Comput. Conf. 2009 Intellect, Initiat. Insight, Innov., pages 30–34.

Parsifal. (2021). [link] (accessed Mar. 10, 2021).

Petersen, K., Feldt, R., Mujtaba, S., & Mattsson, M. (2008). Systematic mapping studies in software engineering. In 12th Int. Conf. Eval. Assess. Softw. Eng. EASE 2008.

Ribeiro, C., dos Anjos, R. K., & Fernandes, C. (2017). Capturing and documenting creative processes in contemporary dance. ACM Int. Conf. Proceeding Ser., vol. Part F1291.

Rossignac, J., Luffel, M., & Vinacua, À. (2012). SAMBA: steadied choreographies. Computational Aesthetics in Graphics, Visualization, and Imaging, 1–9.

Sagasti, F. (2019). Information technology and the arts: the evolution of computer choreography during the last half century. Dance Chronicle, 42, 1:1–52.

Schiphorst, T., Calvert, T., Lee, C., Welman, C., & Gaudet, S. (1990). Tools for interaction with the creative process of composition. In Conf. Hum. Factors Comput. Syst. - Proc., pages 167–174.

Singh, V., Latulipe, C., Carroll, E., & Lottridge, D. (2011). The choreographer’s notebook: a video annotation system for dancers and choreographers. In Proceedings of the 8th ACM Conference on Creativity and Cognition, pages 197-206.

Shapiro, A., Cao, Y., & Faloutsos, P. (2006). Style components. In Proc. - Graph. Interface, pages 33–39.

Sheppard, R. M., Kamali, M., Rivas, R., Tamai, M., Yang, Z., Wu, W., & Nahrstedt, K. (2008). Advancing interactive collaborative mediums through tele-immersive dance (TED): a symbiotic creativity and design environment for art and computer science. In MM’08 - Proc. 2008 ACM Int. Conf. Multimedia, with co-located Symp. Work., pages 579–588.

Soga, A. (2019). Creation and live performance of dance and music based on a body-part motion synthesis system. In 17th ACM SIGGRAPH Int. Conf. Virtual-Reality Contin. its Appl. Ind., pages 3–4.

Soga, A., Shiba, M., & Salz, J. (2009a). Choreography composition and live performance on a Noh stage. In 8th Int. Conf. Virtual Real. Contin. its Appl. Ind., 1, 212, pages 317–318.

Soga, A., Endo, M., & Yasuda, T. (2001). Motion description and composing system for classic ballet animation on the web. In Proc. - IEEE Int. Work. Robot Hum. Interact. Commun., pages 134–139.

Soga, A., Umino, B., & Hirayama, M. (2009b). Automatic composition for contemporary dance using 3D motion clips: experiment on dance training and system evaluation. In Int. Conf. CyberWorlds, CW ’09, pages 171–176.

Soga, A., Umino, B., & Hirayama, M. (2020). Experimental creation of dance by professional choreographers using a body-part motion synthesis system. In Proc. Int. Conf. Cyberworlds, CW 2020, pages 117–120.

Soga, A., Umino, B., Yasuda, T., & Yokoi, S. (2006a). Automatic composition and simulation system for ballet sequences using 3D motion archive. In 2006 Int. Conf. Cyberworlds, CW’06, pages 43–49.

Soga, A., Umino, B., Yasuda, T., & Yokoi, S. (2006b). Web3D dance composer: automatic composition of ballet sequences. In ACM SIGGRAPH 2006 Res. Posters, SIGGRAPH 2006, page 3.

Soga, A., Umino, B., Yasuda, T., & Yokoi, S. (2007). Automatic composition and simulation system for ballet sequences. Vis. Comput., 23, 5: 309–316.

Soga, A., Yazaki, Y., Umino, B., & Hirayama, M. (2016). Body-part motion synthesis system for contemporary dance creation. In SIGGRAPH 2016 - ACM SIGGRAPH 2016 Posters, pages 1–2.

Sun, G., Wong, Y., Cheng, Z., Kankanhalli, M. S., Geng, W., & Li, X. (2021). DeepDance: music-to-dance motion choreography with adversarial learning. IEEE Trans. Multimed., 23: 497–509.

Tamai, M., Wu, W., Nahrstedt, K., & Yasumoto, K. (2008). A view control interface for 3D Tele-immersive environments. In 2008 IEEE International Conference on Multimedia and Expo. IEEE, pages 1101-1104.

Tang, T., Jia, J., & Mao, H. (2018). Dance with melody: an LSTM-autoencoder approach to music-oriented dance synthesis. In MM 2018 - Proc. 2018 ACM Multimed. Conf., pages 1598–1606.

Tang, T., Mao, H., & Jia, J. (2018). Anidance: real-time dance motion synthesize to the song. In MM 2018 - Proc. 2018 ACM Multimed. Conf., pages 1237–1239.

Umino, B., Soga, A., & Hirayama, M. (2014). Feasibility study for contemporary dance e-learning: an interactive creation support system using 3D motion data. In Proc. - 2014 Int. Conf. Cyberworlds, CW 2014, pages 71–76.

Venture, G., Yabuki, T., Kinase, Y., Berthoz, A., & Abe, N. (2016). MovEngine – developing a movement language for 3D visualization and composition of dance. 111: 91–116.

Wohlin, C., Runeson, P., Höst, M., Ohlsson, M.C., Regnell, B., & Wesslén, A. (2012). Experimentation in Software Engineering. Springer Berlin, Heidelberg.

Xue, L. (2020). Multimedia system and information modelling system for computer aided red dance creation with virtual reality. In Proc. 3rd Int. Conf. Intell. Sustain. Syst. ICISS 2020, pages 146–149.

Yalta, N., Watanabe, S., Nakadai, K., & Ogata, T. (2019). Weakly-supervised deep recurrent neural networks for basic dance step generation. In 2019 International Joint Conference on Neural Networks (IJCNN), IEEE, pages 1–8.

Yazaki, Y., Soga, A., Umino, B., & Hirayama, M. (2016). Automatic composition by body-part motion synthesis for supporting dance creation. In Proc. - 2015 Int. Conf. Cyberworlds, CW 2015, pages 200–203.

Yu, T., & Johnson, P. (2003). Tour jeté, pirouette: dance choreographing by computers. Lect. Notes Comput. Sci. (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics), 2723, pages 156–157.

Zaman, L., Sumpeno, S., Hariadi, M., Kristian, Y., Setyati, E., & Kondo, K. (2020). Modeling basic movements of Indonesian traditional dance using generative long short-term memory network. IAENG Int. J. Comput. Sci., 47, 2: 262–270.

Zhang, S. (2020). Recent advances in the application of deep learning to choreography. In Proc. 2020 Int. Conf. Comput. Data Sci. CDS 2020, pages 88–91.

Zhou, L. (2014). Image mosaic technology of dance movements based on digital multimedia and Matlab implementation. Appl. Mech. Mater., 644–650: 4187–4191.

Published

2022-09-06

How to Cite

DORNELAS, I. F.; SEABRA, R. D.; DE SOUZA, A. D. Systematic mapping of technologies for supporting choreographic composition. Journal on Interactive Systems, Porto Alegre, RS, v. 13, n. 1, p. 232–242, 2022. DOI: 10.5753/jis.2022.2635. Available at: https://journals-sol.sbc.org.br/index.php/jis/article/view/2635. Accessed: Nov. 21, 2024.

Issue

Vol. 13 No. 1 (2022)

Section

Regular Paper