Constructing an Innovative Interactive Experience of Stage Visual Design for Augmented Reality in Costume Performance
Published online: 05 Jul 2024
Received: 09 Mar 2024
Accepted: 30 May 2024
DOI: https://doi.org/10.2478/amns-2024-1725
© 2024 Ruoqi Shi, published by Sciendo.
This work is licensed under the Creative Commons Attribution 4.0 International License.
Augmented reality (AR) technology has rapidly advanced across various domains, propelled by its robust interactive immersion and the seamless integration of real and virtual environments. However, its exploration and deployment in theatrical contexts remain limited. This study leverages the Kinect system to capture images during costume performances, employing algorithms for dynamic frame difference merging and human-computer interaction to detect performers’ body movements. Building on this, the study constructs a visually innovative stage for costume performances that enhances the interactive experience for the audience. Additionally, a multimodal emotion analysis model is utilized to assess audience emotions, demonstrating significantly higher accuracy and F1 scores compared to other emotion analysis models. This model effectively integrates speech, expression, and action, surpassing the performance of unimodal analyses in emotion recognition. Furthermore, the audience's experiential perception of stage lighting effects notably exceeds expectations (P=0.013 < 0.05), underscoring an enhanced interaction experience. This research substantiates the transformative potential of AR technology in stage design, offering audiences a more innovative visual and interactive experience, and serves as a valuable reference for future applications in this field.
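The movement-detection step described above rests on frame differencing: pixels that change between consecutive Kinect frames are flagged as motion, and the per-pair masks are merged over a short window. The paper does not publish its exact algorithm, so the following is a minimal NumPy sketch of that general idea; the threshold value and the logical-OR merge are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def frame_difference_mask(frames, threshold=25):
    """Merge binary difference masks from consecutive grayscale frames.

    frames: sequence of 2-D uint8 arrays (grayscale camera frames).
    Returns a boolean mask marking every pixel that changed by more
    than `threshold` in at least one consecutive frame pair -- a
    simple stand-in for a 'dynamic frame difference merging' step.
    """
    frames = [f.astype(np.int16) for f in frames]  # avoid uint8 wrap-around
    merged = np.zeros_like(frames[0], dtype=bool)
    for prev, curr in zip(frames, frames[1:]):
        diff = np.abs(curr - prev)      # per-pixel change magnitude
        merged |= diff > threshold      # merge pairwise masks via logical OR
    return merged

# Toy usage: a bright "performer" patch appears, then moves.
f1 = np.zeros((8, 8), dtype=np.uint8)
f2 = f1.copy(); f2[2:4, 2:4] = 200
f3 = f1.copy(); f3[4:6, 4:6] = 200
mask = frame_difference_mask([f1, f2, f3])
print(mask.sum())  # -> 8 pixels flagged across both moved positions
```

In a live pipeline the merged mask would then feed the gesture-recognition stage that drives the AR stage visuals.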
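The multimodal emotion model fuses speech, facial expression, and body action. One common way such fusion is done is late fusion: each modality produces its own emotion probability vector, and the vectors are combined by a weighted average before the final decision. The sketch below illustrates that generic scheme only; the emotion classes, weights, and fusion rule are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

EMOTIONS = ["positive", "neutral", "negative"]  # illustrative label set

def fuse_modalities(speech_p, face_p, action_p, weights=(0.4, 0.4, 0.2)):
    """Late-fusion sketch: weighted average of per-modality emotion
    probability vectors, then argmax over the fused distribution.
    Weights are hypothetical, not tuned values from the study."""
    stacked = np.stack([np.asarray(speech_p),
                        np.asarray(face_p),
                        np.asarray(action_p)])
    fused = np.average(stacked, axis=0, weights=weights)
    return EMOTIONS[int(np.argmax(fused))], fused

# Usage: each vector is one modality's softmax output for an audience clip.
label, fused = fuse_modalities(speech_p=[0.7, 0.2, 0.1],
                               face_p=[0.6, 0.3, 0.1],
                               action_p=[0.3, 0.4, 0.3])
print(label)  # -> "positive"
```

Averaging complementary modalities is what lets a fused model outperform any single-modality classifier, which matches the abstract's finding that the multimodal model surpasses unimodal analyses.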