Partner Journey #4 with ADAPTIT

In this interview, Georgios Kliafas from ADAPTIT reflects on the team's experience integrating XR technologies into live theatre. Throughout the project, the team explored AR theatre, balancing technical innovation with the practical realities of performance. In this conversation, Kliafas shares insights on team collaboration, design strategies, key development milestones, and lessons learned from testing and deploying XR solutions in a live, dynamic environment. He also discusses the potential future applications of these technologies and methods in cultural settings and beyond.

Over the course of the VOXReality project, your team worked on integrating XR technologies into the live theatre environment. What were the most valuable insights you gained about aligning technical innovation with the needs of a real-world performance?

AR theatre is a new and largely unexplored, undocumented domain, and the team had little previous experience implementing such a performance. We therefore had to develop our own way of working with each other. One lesson that we faced repeatedly in this regard, and whose importance cannot be overstated, is that a shared language for coordinating across the team is crucial to clearly express expectations and requirements. That language may be not just textual but also visual: it can require low-fidelity mockups, role-playing, or interactive prototypes. Devoting time to this team-building process early in the project is a worthy investment. We also learned that it was important to understand more deeply the existing workflows of theatre and to incorporate the XR component as smoothly as possible into those pipelines, instead of suggesting an altogether new pipeline that was very robust but inspired more by software engineering than by the performing arts. For example, some tasks in the AR theatre pipeline overlapped with tasks typically performed in commercial professional applications, and users expected commonalities or even interoperability. We would have benefited from higher user engagement had we prioritized such considerations earlier.

You designed the AR Theatre application with modular and cross-platform principles in mind. How did these design choices prove beneficial during development and testing?

This approach shone during the rehearsal period, a short, high-intensity phase in which many design decisions materialize and new information is revealed constantly. Our design approach helped us perform fast iterations and examine alternative setups very quickly during that phase. A concrete example of the cross-platform benefits is that we quickly and easily produced a non-AR version of the audience app for internal use by the production crew to monitor the services in real time. Another example, this time of modularity, is that we could easily assemble existing components into a new client in our system to handle the spatialized audio, which was not planned initially yet arose as a need as the play evolved.

Across the different iterations of the AR Theatre application, which development phase or milestone do you feel best demonstrated the project’s potential or innovation?

The second iteration (Pilot two) was certainly the most important milestone, since it finally made future paths for adoption clear enough to lay out. We could concretely identify what needed optimizing and streamlining across the whole AR theatre operation, not just the technical and user requirements on the software side. For example, observing the users' reactions and needs was instrumental in understanding how best to accommodate them during onboarding, during the play itself, and after the experience. In particular, we realized that some users had very low digital skills, even lower than we anticipated, while others seemed to fully master the application and even asked for more functionality. That gap was hard to predict and equally hard to cater for. This helped us re-evaluate the usefulness of each feature and prioritize potential extensions and upgrades to the system, while also sketching out new alternative designs for the clients.

Considering the challenges of synchronization, spatial tracking, and security, how did your team ensure that the final system performed reliably in the complex environment of a live production?

Exhaustive testing! Given the complexity and novelty of this experiment, it was through testing sessions under variable conditions that we revealed the limits of the technology. There were no best practices or clear guidelines to adopt; rather, through trial and error we figured out what worked, what didn't, and most importantly, how reliably. We meticulously recorded our process and findings for future reference, but transferability is not guaranteed without a more rigid methodology. For example, we noticed that the AR devices lost tracking when the ambient light dropped too low during dramatic scene changes. However, we did not have access to a sensor to record ambient luminosity, so instead of quantitative guidelines we could only provide qualitative, observational tips for future reference.

As the project concludes, how do you envision the technologies, methods, or lessons from VOXReality shaping your future XR work, either in cultural applications or in other sectors?

We are quite pleased with the technical side of the AR theatre project and consider it a successful technological milestone, and we now see substantial potential and open areas of investigation ahead of us on the practical implementation side. For example, we consider many user experience design questions still open, especially regarding usability and accessibility. This type of work would help unlock more of the existing value in the technical offering. We are also very excited to look into the scalability of the technology, as well as its transferability to other similar fields within the performing arts.

Georgios Kliafas
