
Applying Embodied Design Improvisation for Physical Interaction in Augmented and Virtual Reality
  • Ji-hye Lee : Department of Communications and Networking, Aalto University, Espoo, Finland
  • Lily Díaz-Kommonen : Department of Media, Aalto University, Espoo, Finland
  • Yu Xiao : Department of Communications and Networking, Aalto University, Espoo, Finland

Background This study focuses on the early steps of using embodied design improvisation to create augmented reality (AR) and virtual reality (VR) applications specifically for therapeutic physical activities. Embodied design improvisation is a generative technique to provoke embodied interaction. In three-dimensional digital spaces such as VR and AR, embodied interaction leads users to interact with objects and spatial environments that are simulated virtually. This paper notes that it is necessary to use embodied design improvisation as the first step for designing a physically embodied interaction experience in the VR space. This paper uses VR physical therapy as a case study to show how embodied design improvisation is used to develop design elements for engaging users in VR.

Methods We propose that designers naturally perceive the intended design, but simultaneously, they need a way to actually create that vision. Some design researchers have adopted embodied design improvisation to develop an effective process for designers to understand, think through and evaluate interactions during the design process. Based on previous research on embodied design improvisation, this paper more thoroughly applies the approach to the VR space, where physical actions are important.

First, we investigated the literature on embodied design improvisation and physical interactions in VR. Second, we conducted a workshop to explore diverse and intuitive body movements and combined them for storyboarding and prototyping with physical, video, and Wizard of Oz (WOz) techniques. These methods enabled us to both reveal and evaluate appropriate interactions for implementing physical interactions in AR/VR.

Results We discovered patterns of motions, gestures, and interactions that are most often tacitly employed, and we determined the timing needed to build a practical activity structure. These results informed the design of VR prototypes, which health professionals then evaluated, specifically with regard to the effectiveness of therapeutic physical activities.

Conclusions We focused on the early stage of design: how designers can use embodied design improvisation to effectively create embodied interaction in AR/VR, specifically where physical interaction is important for effective results of physical therapy and education. This paper explains the concept and methodology of embodied design improvisation in which designers can adapt to create embodied interaction in physical interaction-based VR activities.

Keywords: Augmented and Virtual Reality, Embodied Design Improvisation, Embodied Interaction, Prototype Development, Design Research
pISSN: 1226-8046
eISSN: 2288-2987
Publisher: Korean Society of Design Science (한국디자인학회)
Received: 17 Dec, 2018
Revised: 02 May, 2019
Accepted: 16 May, 2019
Printed: 31 May, 2019
Volume: 32 Issue: 2
Page: 5 ~ 17
DOI: https://doi.org/10.15187/adr.2019.
Corresponding Author: Ji-hye Lee (jihyelee1129@hotmail.com)


Lee, J., Diaz-Kommonen, L., & Xiao, Y. (2019). Applying Embodied Design Improvisation for Physical Interaction in Augmented and Virtual Reality. Archives of Design Research, 32(2), 5-17.

Copyright : This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/), which permits unrestricted educational and non-commercial use, provided the original work is properly cited.

1. Introduction

Recently, three-dimensional digital environments, such as augmented reality (AR) and virtual reality (VR), have been emerging in the physical therapy realm. AR/VR environments simulate users’ physical activities and interactions in a virtual or imaginary environment to improve patients’ therapeutic outcomes.

Considering the characteristics of VR physical therapy, which requires diverse physical movements from users, this research focuses on how designers can adjust the interaction method, based on embodied design improvisation, to create engagement in VR physical therapy. As a case study, we focused on chronic pain management with physical activities in the AR/VR environment. Chronic pain patients are very diverse and have varied symptoms, so different activities are needed depending on the pain area (NIH, 2019). We conducted preliminary interviews with health professionals who stated that there are no strict protocols for chronic pain patients, but rather diverse activities that professionals suggest to encourage patients to actively move their bodies.

This study focuses on the early stage of designing physical interaction for AR/VR applications using embodied design improvisation as a design research tool. Embodied design improvisation is a generative technique to provoke embodied interaction. In three-dimensional (3D) digital spaces such as AR and VR, embodied interaction leads users to interact with objects and spatial environments that are simulated virtually. Accordingly, this embodied interaction conducted in AR and VR environments effectively leads to physical activity in the real world.

This paper notes that it is necessary to use embodied design improvisation as the first step for designing physically embodied interactions in the VR space. This paper examines physical therapy as a case study to show how embodied design improvisation is used to develop design elements in AR or VR. Because the investigation of body movement is a crucial aspect of embodied design improvisation, the next section discusses design methods that use body movement.

2. Literature Review
2. 1. Design Method using Body Movement

Many researchers have attempted to use body movements to develop new concepts in the design realm, especially since the emergence of user-centred creative design methods in the 1990s (Löwgren, 1995). Awareness of one's body has been one of the primary components of the movement experience, but as Levisohn et al. (2011) noted, approaches within human-computer interaction (HCI) are still lacking, even as interest grows in understanding the body through computational interactions. A deeper understanding of the movement experience holds great potential to change the user experience of body movement-based interactions. Because the AR/VR environment lets users move and engage with interactive objects inside the simulated environment, body movement-based interactions are crucial.

To design body movement-based interactions, Brandt (2006) explored the generation of design ideas by providing a stage and props. Within the user-centred design tradition, researchers have argued that the use of drama and various props gives fruitful opportunities to directly engage users in the design process. The importance lies in not only ‘things to think with’ but also ‘things to act with’. To encourage ‘things to act with’, drama has been used to stimulate body movement as a valuable tool for staging between designers and users (Brandt, 2006).

Contemporary interface design still predominantly uses screens that rely on models borrowed from desktop and mobile computing, downplaying the body's role (Levisohn et al., 2011; Pallota, 2009). Ubiquitous computing and 3D technology have extended the potential of the body's role; however, designers continue to develop applications primarily based on their prior experience and knowledge of visual and aural content rather than other forms of sensory communication. While the technology still has limitations, this is mostly because designers have not extended their thinking beyond screen-based interactions to approaches involving embodied interaction (Levisohn et al., 2011).

Instead, there have been studies on developing body movements in mediums in which physical interaction is important, especially in the realm of 3D computer games (Segura et al., 2013). Many computer games require diverse settings for physical interactions for games that use the body. To make these diverse settings, Segura et al. (2013) noted that an integrated design approach is necessary for such games, especially to emphasise free and natural movements.

In this paper, we propose that while designers may naturally envision a needed design, they simultaneously must determine how to make that design a reality. Design researchers, including Sirkin et al. (2014), have adopted embodied design improvisation to develop an effective process for designers to understand, think through, and evaluate interactions while designing. Building on this previous research, this paper examines the approach in more depth for the AR/VR space, in which physical actions are crucial.

This paper first reviews literature and previous research on embodied design improvisation, physical interactions in AR and VR, and relevant design approaches. We then describe a workshop conducted to explore diverse and intuitive body movements and combine them with storyboarding and physical, video, and Wizard of Oz (WOz) prototyping techniques to both reveal and evaluate appropriate interactions for design.

2. 2. Embodied Design Improvisation

Brown (2008) notes that when designers’ sensibilities and methods correspond with users’ needs, technological feasibility, and a viable business strategy, this can convert into a valuable opportunity. Historically, designers did not play a role in the early phases of product development until the first half of the twentieth century; however, participating in the earlier phases of developing ideas and creating customer values is becoming more important in the designer’s role (Brown, 2008). Accordingly, creating ideas from generative techniques including embodied design improvisation has been investigated in the design research realm. Levisohn et al. (2011) and Sirkin et al. (2014) have discussed embodied design improvisation, which refers to “a generative and evaluative technique to elicit tacit knowledge about embodied experience” (Sirkin et al., 2014, 1). Over the course of several decades, theories of embodiment have been a central research topic in diverse disciplines such as cognitive science, media studies, performance, media arts, dance, and philosophy (Levisohn et al., 2011). Historically, it has been a critical alternative to the Cartesian philosophy of the separation of mind and body in which the mind is given superiority in the construction of experience (Levisohn et al., 2011). Apart from the long-standing Cartesian philosophy, a central point of embodiment is that the body is the foundation for the structure of mindful experience (Johnson, 2008).

Several neuroscientists and linguistics scholars, including Barsalou (1999) and Damasio (1994), have asserted that emotions are produced within the body before they are expressed as feelings within the brain. The scientific community, including researchers in cognitive science and, more recently, HCI, has conducted most of the research on embodiment (Levisohn et al., 2011). Scientists have mainly investigated the role of the body, specifically its haptic qualities, within the scientific paradigm; however, some have explored how to apply this understanding of embodiment to construct a lived experience.

Instead, this paper focuses on body-based practices, specifically the influence of embodied improvisation on design activity, which we refer to as embodied design improvisation. Embodied design improvisation is a term proposed by Sirkin et al. (2014) in the context of design thinking but with a focus on behavioural experiments and designer adaptation. Similarly, many designers have regarded bodystorming as an important generative ideation method. Bodystorming is a user-centred design (UCD) technique that is an effective tool for creating quick and intuitive guidelines for 3D products or environments. It allows participants to physically act in a given situation with the aim of developing and innovating. In the bodystorming method, role-playing is regarded as important because the process interweaves ideas from imagining and performing.

Generally speaking, bodystorming allows the user to interact and role-play the scenario in diverse prototyping environments, including not only the actual service location but also office or lab spaces equipped with mock-ups (Boletsis et al., 2017). Bodystorming attempts to place the user at the location where the real situations occur, in order to obtain valuable feedback about the users' experiences.

However, bodystorming is a rather user-centred ideation method, whereas embodied design improvisation combines storyboarding, physical and video prototyping, WOz techniques, and crowdsourced experimentation to both reveal and evaluate appropriate interactions for designers (Sirkin et al., 2014). Embodied design improvisation is, rather, a method of organizing ideas generated in combination by several techniques. Brown (2008) notes that organizing ideas in the design process is the result of significant effort, augmented by a process of finding and iterating from prototyping to refining. Bodystorming has been applied as a design research technique for generating intuitive ideas in the design process. It aims to span empathy work, ideation, and immediate prototyping in small groups of users who physically experience situations, helping designers derive new or unexpected ideas (Witthoft et al., 2010; Oulasvirta et al., 2003). However, it focuses on the process itself more than on the results. Bodystorming is a useful method for generating ideas, but it limits the designer's ability to reflect on his or her own sensibilities and creativity and to combine them with the users' needs. In contrast, embodied design improvisation highlights the designer's opportunity to combine and reorganize the ideas obtained from users' physical activities. Embodied design improvisation is thus a design process for regenerating ideas from behavioural experiments.

Table 1
Bodystorming and Embodied Design Improvisation

Methods
  • Bodystorming: Role-playing the service scenario in a simulated service environment, using props and mock-ups
  • Embodied design improvisation: Combining storyboarding, physical and video prototyping, WOz techniques, and crowdsourced experimentation to both reveal and evaluate appropriate interactions for designers

Main inputs for design
  • Bodystorming: Body movement
  • Embodied design improvisation: Body movement, emotional feedback, tacit knowledge of designers

Embodied design improvisation is a collaborative method for designers to rearrange and organize data obtained from diverse settings, mainly on physical body movements and emotions. As mentioned previously, designers understand what should be designed. However, designers simultaneously need a method to produce an actionable system. For physical interaction-based products or services, designers need to understand, think through, and evaluate user interactions during the design process. In this sense, embodied design improvisation offers designers a way to combine, rearrange, and find interaction patterns for building functional systems from storyboarding, physical and video prototyping, WOz, and crowdsourced experimentation. This method is, therefore, not only about body movement ideation but also about organising ideas based on body movement ideation.

This paper explores the potential relationship between improvisation and design, examining how design can benefit from improvisation. It argues that improvisation can build perspectives and skills that are critical for designers, such as collaborating creatively, fostering innovation, supporting spontaneity, learning through errors, and presenting ideas. The paper reviews designers' use of improvisation activities in a case study on the application of therapy in AR/VR.

3. Creating Physical Interaction for AR/VR Application
3. 1. AR/VR Environment

Among current technologies that use augmented awareness of the body, AR/VR offers opportunities for overcoming challenges by developing new modes of interaction that align more closely with natural movement-based interaction and that also incorporate the sensory faculties (Levisohn et al., 2011). This paper applies the method to both AR and VR because it can be used for any mode that combines a user's physical interaction with an experience in a 3D space; all of these body-aware technologies share that advantage. Interaction means a fruitful activity of an embodied human in the context in which it is experienced (Diaz et al., 2009). The user's experience is delivered through all the senses simultaneously to convey understanding (Diaz et al., 2009). Thus, a sensual experience represents how each person perceives systems and objects and how a person's experiences are woven together in the context and flow of the user's interaction with 3D systems. In this sense, 3D user interaction is the user's interaction with virtually created 3D objects, environments, or information in physical or virtual space (Bowman et al., 2008).

The AR/VR environment is defined as a 3D space generated by computing technology (Baus et al., 2014). Specifically, VR covers all environmental elements detaching the user (who is wearing a Head-Mounted Display (HMD)) from the real environment.

Generally speaking, 3D spaces and objects consist of visual and acoustic stimuli (Baus et al., 2014). Besides these stimuli, haptic, olfactory, or gustatory stimuli can also be considered (Baus et al., 2014; Sundren et al., 1992; Burdea & Coiffet, 1994; Kalawsky, 2000; Fuchs et al., 2006).

Figure 1 The 3D environment of Augmented and Virtual Reality (Image courtesy of www.autodesk.com)

As mentioned, AR and VR spaces are 3D environments that contain the user's physical movements and interactions with objects and environments. In this sense, incorporating body movements into the design of AR/VR environments has the potential to enable designers to create more intuitive physical interaction.

As a case study, we investigate how to address physical therapy in AR/VR environments in the next sections. Despite the growing importance of physical therapy in today’s ageing society and the significant increase of physical therapy applications, we still lack a design approach that considers a user-centred experience.

3. 2. Use of AR/VR for Physical Therapy

In most AR/VR environments for physical therapy, activities are conducted in an environment that mimics daily life to produce daily activities such as dressing, opening doors, grooming, and completing kitchen activities (Merians et al., 2002). VR applications in physical therapy also feature simulations that replicate the involvement of health professionals (Kilic et al., 2017). In this situation, health professionals cannot rigorously test their patients’ outcomes and health (Lalloo et al., 2015).

Merians et al. (2002) describe earlier research on physical therapy through VR and AR for patients who have experienced a stroke. This research aimed to train functional range of motion, movement speed, fractionation, and force production through an interactive and motivating environment in which practice can be intense and feedback can be delivered as individualized treatment.

As demonstrated in Figure 2, therapy using AR/VR is mainly targeted at solving specific problems in daily life. However, the aim of using AR and VR in the therapy realm has extended beyond solving specific therapeutic problems at the clinic to diversifying the user's experience and providing pleasurable emotions. To use AR/VR in the therapeutic realm to its full potential, the experience should provide an expanded physical interaction that includes the entire body. This will require continued innovation in technology as well as design research into embodied interactions that better support human activity for therapeutic purposes.

Figure 2 Creation of experience that mimics patients’ daily lives for VR physical therapy (The University of Pittsburgh VR grocery store simulates the challenge of shopping for people who have balance disorders. Photo courtesy of the University of Pittsburgh Medical Center.)

For the ideation of AR/VR concepts for physical therapy through physical interaction using embodied design improvisation, this paper explores the impact of body movement in generating design ideas through a practical workshop. As a fundamental study for developing a new AR/VR physical therapy concept, the outcome of this paper is a scenario that uses a storyboard and physical and video prototyping using WOz. This paper aims to represent a process to approach the design of therapeutic physical activities for AR/VR.

In this paper, we specifically focus on chronic pain management as a case study, which requires diverse physical interactions. Chronic pain patients are very diverse with a variety of symptoms, and thus, diverse activities are attempted depending on the pain area (NIH, 2019). We conducted a preliminary interview with health professionals who stated that there are no strict protocols for chronic pain patients, but rather diverse activities are needed to make patients actively move.

4. Design Research Process Applying Embodied Design Improvisation

The following sections detail elements of our design research process in the context of interaction studies. Following Sirkin et al. (2014), we used embodied design improvisation as a design research tool through the following steps: 1) identify a research question, 2) storyboard people, activities, and environments, 3) improvise usage scenarios with experts, 4) video record to demonstrate usage, and 5) conduct a field study to confirm findings.

Figure 3 Embodied Design Improvisation process (Sirkin et al., 2014)

We conducted a workshop with the objective of documenting activities and rendering them into a story format. In the workshop, three participants with chronic pain performed the activities and were recorded on video. All the activities were planned to take two to three hours at most, and the process was designed to be carried out using improvisational theatre techniques.

4. 1. Identifying a Research Question

First, we identified a research question using the “What if?” scenario suggested as a design planning method by Stanford’s d.school, which can be applied differently according to the situation. We started from the question “What if the patient walks in the forest, not the clinic?” to make participants act in diverse ways with imagined objects from nature. This research was conducted in Finland, and several patients mentioned in casual conversation that they would like to walk in the forest as part of their physical therapeutic treatment. Thus, among various possible surroundings, we chose the forest as the background of the AR/VR environment to encourage patients to perform more physical activities and freer movements. When asking participants to imagine situations, we also asked them to use the furniture and tools around the room.

4. 2. Storyboarding People, Activities, and Environments

In theatre performances and filmmaking, scripts are used to define all the details, including the props, lights, movements of the performers, and dialogue. The scripts are planned and written before the play or filmmaking commences. In spite of the existence of a script, a performance can be regarded as “an open structure to be filled with the participant’s own knowledge and experience” (Diaz et al., 2009, 83). Cortes et al. (1976) have noted that certain forms of improvisational theatre keep changing their key features based on the characters’ interactions during the collaborative play and that the same is true for the script. In modern society, examples of such open-structured scripts can also be found in live performances using interactive media or movement-based interactive dance performances. Such a performance uses motion capture technology so that the dancer’s movements are detected in real time and provide reflective visual feedback on the stage, which contributes to the next movements (James et al., 2006). This idea of movement-based interactive performance stems from artists influenced by Umberto Eco’s notion of the ‘open work’, which refers to ‘a work in movement’ (Jennings, 1996). Several researchers, including Sparacino et al. (2000), have argued that open-structured movement can enable the generation of meaningful content. This method of creating content can be applied to diverse realms involving physical movement.

Based on the idea of open structure, we asked participants to act as physical therapy patients and therapists, to talk, and to move around. Their activities and dialogues were recorded and organized into narrative episodes, from which we made storyboards. The story was subdivided into storyboards on consultation, evaluation, and diagnosis, in that order, even though participants had freely simulated the roles without consideration of the order.

Figure 4 Storyboard

Storyboarding is a technique broadly used in film, advertising, games, and theatre, but it has recently also been applied in design and science fields (Alderman, 2008). In storyboards, it is important to capture specific moments showing the characters’ emotions, movements, expressions, gestures, sounds, feelings, conversations, surroundings, and artefacts. These elements are represented in a narrative flow. When people work together, this helps the doctor and patient more easily understand and determine their responsibilities.

Carroll (1999) mentions that a scenario can be narrated from different points of view, explaining how actions are prepared around a system and arranged in time. The activities in the workshop were acted from the patient’s perspective. The narrative for the final scenario combines separate episodes created from participants’ personal experiences, such as back pain or ankle pain. Regardless of the category of pain, the episodes have similar processes, especially in the simulated situation in which the therapist meets the patient for the first time and checks the patient’s condition. Therefore, these parts of the consultation were combined. In this sense, we reorganized the storyboard and ordered the chapters as consultation, evaluation, and diagnosis. However, each chapter contains different and diverse stories created by several different patients and therapists.

4. 3. Improvising Usage Scenarios with Experts

To evaluate whether storyboarding effectively simulates therapeutic conditions and activities, we conducted a one-on-one in-depth interview with a health professional. Instead of conducting improvisation with experts, this study adopted a modified form of Sirkin et al.’s process. We created a storyboard in the form of a book containing a narrative of the activities of people in the roles of patient and therapist. We turned the pages of the storyboard one by one, reading the scripts written below. During the process, the expert nodded when she understood an episode and interrupted if she wanted to modify a specific scene or add her opinions.

Figure 5 A health professional reveals her opinions while researchers read the storyboard

We conducted an in-depth interview with a health professional in nursing and physical therapy who has been working in a hospital for more than 10 years. The professional was shown the scenario and asked to indicate which scenes in the storyboard were meaningful from a therapeutic perspective. The scenes that the professional selected were connected and expanded, and the order of the storyboard was subsequently rearranged.

More importantly, the health expert emphasised the significance of seamless storytelling in encouraging patients’ continuous activities. Seamless storytelling allows the patient’s intuitive behaviours to be followed, improving therapeutic results. In subsequent work, we would like to explore whether seamless storytelling can help create an effective interaction space and representation that leads to appropriate patient behaviour and therapy. For this paper, however, we have focused on creating an initial concept prototype, described in the following pages.

4. 4. Video Recording to Demonstrate Usage

In the following simulated play, we investigated how participants act to demonstrate usage based on the storyboard. At first, we let participants behave according to the storyboard’s order, or change the order, to find the most natural behaviours in a simulated forest environment. Participants acted out the consultation, evaluation of pain, and diagnosis in sequence.

Figure 6 Video session

Besides consultation and diagnosis, there were diverse variations of evaluation behaviours in the forest. The participants imagined diverse natural objects such as trees, bushes, fruits, and berries. Static evaluation on a storyboard was changed into more dynamic activities with imagined objects. For instance, one stretching behaviour in the evaluation phase was diversified into 1) walking toward an apple tree, 2) stretching toward an apple hanging in the tree, and 3) taking the apple and receiving it in the arms, as shown in Figure 7.

Figure 7 Diversified movements in the evaluation part of the video session. The order in the session: walking forward, stretching, and receiving
4. 5. Conducting a Field Study to Confirm Findings

Based on our previous investigation of ideas through video recording and storyboarding, we developed a prototype using Unity3D for the physical stage. The background was defined as a natural environment with grass and sky, and the focal area for patients’ behaviours was defined as an apple tree, so that patients would immediately recognize where to go. Natural interaction is necessary when users experience and interact with objects in a VR environment; however, at this stage, we created a background in VR and simulated natural interaction through the WOz technique. We had participants walk toward the tree and told them to stretch their arms toward the apple and pick it up. As previously indicated, this process enables a low-fidelity prototype to be tested with users and the resulting knowledge to be collected. Through this process, we demonstrated usage and conducted a field study to confirm the findings. We invited people to the field study and explored how they reacted to the demo through WOz and the think-aloud protocol.
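The WOz setup described above can be sketched as a simple event loop in which a hidden operator, rather than a sensing system, triggers the responses the participant sees. The following Python sketch is purely illustrative (the actual prototype was built in Unity3D); all event names and system responses are assumptions made for the example:

```python
# Wizard-of-Oz sketch: a hidden operator ("wizard") injects events that
# the participant believes the system detected automatically.
from dataclasses import dataclass, field

@dataclass
class WozSession:
    log: list = field(default_factory=list)  # ordered record for later analysis

    def wizard_event(self, name):
        """Operator manually triggers an event in place of real sensing."""
        self.log.append(name)
        return self.on_event(name)

    def on_event(self, name):
        # System responses shown to the participant for each staged event
        # (hypothetical; the real responses were rendered in the VR scene).
        responses = {
            "walk_to_tree": "Tree grows closer in the headset view",
            "stretch_to_apple": "Apple highlights and plays a ping",
            "grab_apple": "Apple detaches; score increases",
        }
        return responses.get(name, "No visible response")

session = WozSession()
print(session.wizard_event("stretch_to_apple"))  # operator saw the stretch
print(session.log)
```

Because the wizard's triggers are logged in order, the same record that drives the demo also serves as data for refining the interaction later.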

Figure 8 VR implications with the process

To make the patients grab an apple, we used sensor-based gloves connected to a computer so that the patient could move his or her hands in reality and see virtual hands move in accordance with the real ones in the VR environment. Existing gloves can provide real-time feedback on the real hand’s gestures in the VR environment, but cannot provide haptic feedback such as the weight of the apple or a sense of touch.
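As a rough illustration of how such sensor-based gloves typically work, the sketch below maps raw flex-sensor readings to virtual finger joint angles, producing the real-time visual feedback described above (and, as noted, no haptic feedback). The sensor ranges and the linear mapping are assumptions for illustration; the paper does not describe the glove's actual interface:

```python
# Hypothetical mapping from glove flex-sensor readings to joint angles.
SENSOR_MIN, SENSOR_MAX = 120, 900  # assumed raw ADC range of one flex sensor
ANGLE_MAX = 90.0                   # maximum joint flexion in degrees

def sensor_to_angle(raw):
    """Clamp a raw reading and scale it linearly to a flexion angle."""
    raw = max(SENSOR_MIN, min(SENSOR_MAX, raw))  # discard out-of-range noise
    t = (raw - SENSOR_MIN) / (SENSOR_MAX - SENSOR_MIN)
    return t * ANGLE_MAX

def hand_pose(readings):
    """Map one raw reading per finger to the joint angles of the VR hand."""
    return {finger: round(sensor_to_angle(raw), 1)
            for finger, raw in readings.items()}

print(hand_pose({"index": 900, "thumb": 120}))
```

In a real system this mapping would run once per frame, so the virtual hand follows the physical hand with no perceptible delay.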

Four elements in the application serve as key components of therapeutic activity:

• Background: A tree was chosen for a more natural and beautiful empathic environment.

• Walk & Reach: Based on the behaviour of hands reaching upward toward a tree or a ceiling, the action of walking toward the tree and reaching out to it was developed.

• Stretch & Pull: Following this idea, we imagined an apple hanging from the tree so that the user’s hands would try to reach it.

• Getting Scores: Through programming, we can move the apple to different places where the patient can try to grab it so that we can calculate a therapeutic improvement score based on the apples grabbed by the user.
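The “Getting Scores” element could be realized, for example, by weighting each grabbed apple by the reach distance it demands, so that harder reaches contribute more to the therapeutic improvement score. The following Python sketch is a hypothetical illustration; the apple positions and the scoring rule are assumptions, not the paper's actual implementation:

```python
# Hypothetical scoring: apples at varying reach positions, score weighted
# by the distance the patient had to reach to grab each one.
import math

# Apple positions relative to the patient's shoulder, in metres (x, y, z).
APPLES = {
    "low":  (0.3, 0.2, 0.4),
    "mid":  (0.2, 0.5, 0.5),
    "high": (0.1, 0.8, 0.5),
}

def reach_distance(pos):
    """Straight-line distance from the shoulder to an apple position."""
    return math.sqrt(sum(c * c for c in pos))

def session_score(grabbed):
    """Sum the reach distances of all grabbed apples, rounded to 2 d.p."""
    return round(sum(reach_distance(APPLES[name]) for name in grabbed), 2)

# A patient who grabs the low and mid apples:
print(session_score(["low", "mid"]))
```

By repositioning the apples between sessions and comparing scores, a therapist could track how the patient's comfortable reach envelope changes over time.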

During the field study, we conducted empirical investigations and determined that different stages of the activity can support different assessments of the patient’s condition and health. More apples can be included, and the apples can blink and give sound cues, such as pings, as the user’s hand hovers over them.
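Such blink and ping cues would typically be driven by the distance between the virtual hand and each apple. As a minimal sketch of that idea in Python (the cue names and the 0.15 m trigger radius are our assumptions, not values from the prototype):

```python
import math

def hover_cue(hand, apple, radius=0.15):
    """Decide which feedback cue to fire as the hand nears an apple.
    Positions are (x, y, z) tuples in metres; within `radius` the
    sound cue fires, within twice that the apple blinks."""
    d = math.dist(hand, apple)
    if d <= radius:
        return "ping"        # sound cue: hand is over the apple
    if d <= radius * 2:
        return "blink"       # visual cue: hand is approaching
    return None              # no cue: hand is far away

print(hover_cue((0.0, 1.6, 0.5), (0.0, 1.7, 0.5)))
```

Layering a visual cue at a wider radius than the sound cue gives the patient progressive feedback as the reach gets closer.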

However, because the glove was tethered to the PC, patients’ movement was restricted; they reported difficulty and slight hesitation in walking naturally. Despite this limitation, patients were satisfied with the intuitive representation of the apple tree, which they felt inclined to walk toward without any instruction, and with the apple, which inspired them to act. When approaching the apple, the patient heard a sound effect that indicated earning points and motivated the patient to earn as many points as possible.

Figure 9 Process of experience with time length

After the session, the timeline of the prototype experience was refined as shown in Figure 9.

5. Discussion

Throughout the workshop and field study, we explored physical interactions with diversified movements in a simulated forest environment. As a result, we found useful and natural activities that patients can perform as therapeutic activities. These activities were then examined by health professionals and applied to the application. Based on participants’ activities, we produced the main activity and its artefacts in Unity3D as a quick prototype. Through this process, we identified the patterns of motions, gestures, and interactions that are most often tacitly employed. These results informed the design of the VR prototype and were evaluated with health professionals, specifically with regard to the effectiveness of the therapeutic physical activities.
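Identifying which motions are “most often tacitly employed” amounts to tallying coded observations from the workshop recordings and keeping the recurring ones. A minimal sketch of that tallying step, where the gesture codes and the recurrence threshold are purely illustrative (the study does not report its coding scheme):

```python
from collections import Counter

def tacit_patterns(coded_observations, min_count=2):
    """Count gesture codes annotated from workshop video and return
    those recurring often enough to treat as tacit patterns, most
    frequent first."""
    counts = Counter(coded_observations)
    return [(g, n) for g, n in counts.most_common() if n >= min_count]

# Hypothetical codes from annotating one workshop recording:
observations = ["reach_up", "stretch_arm", "reach_up", "lean_forward",
                "reach_up", "stretch_arm"]
print(tacit_patterns(observations))
```

Frequently recurring codes such as upward reaching are the candidates that carry over into the prototype’s core activity.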

By adopting this method, the designer could begin to make explicit things that are tacitly known but difficult to articulate for physical therapy patients in the AR/VR environment. This resulted in specific objects, activities, and environments forming a seamless story, a different outcome from previous efforts that simply simulated the real-life daily experiences of patients with physical pain. The application designer could expose her understandings to open discourse, allowing her to operationalize the resulting insights into practices that can be employed by others. After organising ideas through research questions, storyboarding, and expert evaluation, video recordings of physical body improvisation were made to demonstrate usage. Throughout the process, participants expanded the simple evaluation activities conducted by therapists into more patient-centred, diversified movements. We reflected these activities in the demo with a more detailed time frame and created VR content consisting of physical therapy in a natural environment with an apple tree. Although the content is simple, it offers possibilities for patients to perform evaluation activities and further interactions in the AR/VR environment.

The embodied design improvisation that we conducted enabled us to create content and interaction for physical therapy using AR/VR technology, drawing on unexpected imagination and diversified movements. This activity transformed one simple evaluation behaviour conducted in a clinic into diversified physical movements and interactions in the AR/VR environment.

6. Conclusion

We have presented here one possible approach to embodied design improvisation for creating an AR/VR application that focuses on physical interaction. As a case study, we created a physical therapy application with VR technology.

Emerging developments in AR/VR open huge possibilities for the user’s physical interaction and movement in a 3D digital space. In line with this development, this paper emphasizes designing the activity of physical movement itself. We chose embodied design improvisation over bodystorming, which is currently adopted by many design researchers, because embodied design improvisation allows the designer’s knowledge and sensibility to contribute more actively, whereas bodystorming generates participants’ ideas and reflects them as they are.

Throughout the process of embodied design improvisation, we discovered how participants expanded simple scenes into diversified movements and possible interactions with the surroundings. We could reflect these expanded scenarios, created by participants’ physical improvisation, in the demo with detailed settings of time and content.

Ultimately, this paper contributes a method of creating AR/VR applications that stimulate physical movement both effectively and engagingly. We noted that existing physical therapy applications using VR/AR technology have mostly mimicked tasks done at the clinic or at home. With our approach, we could investigate how diversified and interesting movements and objects could be used in therapeutic activities, and thereby contribute to creating dynamic and unexpected content containing diversified movement and interaction for AR/VR applications.

  1. Alderman, I-M., & Beyers, D-J. (2008). Documentary Visions, Theological Insights. Teaching Theology & Religion, 12(3), 233-247. [https://doi.org/10.1111/j.1467-9647.2009.00525.x]
  2. Barsalou, L. W. (1999). Perceptual symbol systems. Behavioral and Brain Sciences, 22, 577-600. [https://doi.org/10.1017/S0140525X99002149]
  3. Baus, O., & Bouchard, S. (2014). Moving from virtual reality exposure-based therapy to augmented reality exposure-based therapy: A review. Frontiers in Human Neuroscience, 8.
  4. Boletsis, C., Karahasanovic, A., & Fjuk, A. (2017). Virtual Bodystorming: Utilizing Virtual Reality for Prototyping in Service Design. AVR 2017, 10324, 279-288. [https://doi.org/10.1007/978-3-319-60922-5_22]
  5. Bowman, D., Coquillart, S., Froehlich, B., Hirose, M., Kitamura, Y., Kiyokawa, K., & Stuerzlinger, W. (2008). 3D User Interfaces: New Directions and Perspectives. IEEE Computer Graphics and Applications. Retrieved from https://ieeexplore.ieee.org/document/4670098.
  6. Brandt, E. (2006). Designing Exploratory Design Games: A Framework for Participation in Participatory Design?. Proceedings of the Ninth Participatory Design Conference (PDC 2006), 57-66. [https://doi.org/10.1145/1147261.1147271]
  7. Brown, T. (2008). Design Thinking. Harvard Business Review, 6, 1-10.
  8. Carroll, J. M. (1999). Five reasons for scenario-based design. HICSS '99 Proceedings of the Thirty-Second Annual Hawaii International Conference on System Sciences, 3(3), 3051.
  9. Cortes, F., Falcón, A., & Flores, J. (1976). The Cultural Expression of Puerto Ricans in New York: A Theoretical Perspective and Critical Review. Latin American Perspectives, 3(3), 117-152. [https://doi.org/10.1177/0094582X7600300307]
  10. Damasio, A. (1994). Descartes' error: Emotion, reason, and the human brain. New York: Crosset/Putnam.
  11. Diaz, L., Salmi, A., & Reunanen, M. (2009). Role playing and collaborative scenario development. International Conference on Engineering Design, ICED '09, 24-27 August 2009, 79-86.
  12. James, J., Ingalls, T., Qian, G., Olsen, L., Whiteley, D., Wong, S., & Rikakis, T. (2006). Movement-based Interactive Dance Performance. MM '06 Proceedings of the 14th ACM International Conference on Multimedia, 470-480. [https://doi.org/10.1145/1180639.1180733]
  13. Jennings, P. (1996). Narrative Structures for New Media: Towards a New Definition. Leonardo, 29(5), 345-350. [https://doi.org/10.2307/1576398]
  14. Johnson, M. (2008). What Makes a Body?. The Journal of Speculative Philosophy, 22(3), 159-169. [https://doi.org/10.1353/jsp.0.0046]
  15. Kilic, M-M., Muratli, O-C., & Catal, C. (2017). Virtual reality based rehabilitation system for Parkinson and multiple sclerosis patients. International Conference on Computer Science and Engineering (UBMK), 5-8 Oct. [https://doi.org/10.1109/UBMK.2017.8093401]
  16. Lalloo, C., Jibb, L., Rivera, J., Agarwal, A., & Stinson, J. (2015). "There's a Pain App for That": Review of Patient-targeted Smartphone Applications for Pain Management. The Clinical Journal of Pain, 31(6), 557-563. [https://doi.org/10.1097/AJP.0000000000000171]
  17. Levisohn, A., & Schiphorst, T. (2011). Embodied Engagement: Supporting Movement Awareness in Ubiquitous Computing Systems. Ubiquitous Learning: An International Journal, 3(4), 97-111. [https://doi.org/10.18848/1835-9795/CGP/v03i04/40309]
  18. Löwgren, J. (1995). Applying design methodology to software development. Proc. Symp. Designing Interactive Systems (DIS '95), 87-95. New York: ACM Press. [https://doi.org/10.1145/225434.225444]
  19. Merians, A-S., Jack, D., Boian, R., Tremaine, M., Burdea, G-C., Adamovich, S-V., Recce, M., & Poizner, H. (2002). Virtual Reality-Augmented Rehabilitation for Patients Following Stroke. Physical Therapy, 82(9), 898-915.
  20. NIH. Retrieved February, 2019, from https://nccih.nih.gov/health/pain/chronic.htm.
  21. Oulasvirta, A., Kurvinen, E., & Kankainen, T. (2003). Understanding contexts by being there: Case studies in bodystorming. Personal and Ubiquitous Computing, 7(2), 125-134. [https://doi.org/10.1007/s00779-003-0238-7]
  22. Márquez Segura, E., Turmo Vidal, L., & Rostami, A. (2016). Bodystorming for Movement-Based Interaction Design. Human Technology, 12(2), 193-251. [https://doi.org/10.17011/ht/urn.201611174655]
  23. Sirkin, D., & Ju, W. (2014). Using Embodied Design Improvisation as a Design Research Tool. International Conference on Human Behavior in Design (HBiD 2014), 1-7.
  24. Sparacino, F., Davenport, G., & Pentland, A. (2000). Media in performance: Interactive spaces for dance, theater, circus, and museum exhibits. IBM Systems Journal, 39(3.4), 479-510. [https://doi.org/10.1147/sj.393.0479]
  25. Witthoft, S., & Geehr, C. (2010). Bodystorming. Retrieved February, 2019, from https://dschool-old.stanford.edu/groups/k12/wiki/48c54/Bodystorming.html.