Live Production Based on the Concept
The concept of the event was to "expand live performance with technology" using WonderScreen, a transmissive screen. The catchphrase was the collaboration between the visuals projected on the WonderScreen and the artists' song and dance.
If the videos play too dominant a role, the show no longer works as a live performance by the artists; if they are too restrained, it cannot be called a collaboration. There was also a concern that simply playing back pre-made videos would halve the significance of using a transmissive screen.
Based on these considerations, I built the production around the concept of a "live performance in a virtual space," mixing real and virtual space and using video to make the main performer stand out in each part of the show.
Real-Time and Audio-Reactive Visuals
The visuals are rendered in real time with TouchDesigner. Audio from the mixer is routed through an audio interface to drive audio-reactive visualization.
The system is scripted in Python.
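The actual TouchDesigner network is not shown in this article, so as a rough illustration of the audio-reactive idea, here is a minimal plain-Python sketch: it splits one audio buffer into frequency bands (a role an Audio Spectrum CHOP would play inside TouchDesigner) and maps the band levels to visual parameters. All function names, band edges, and parameter names (`scale`, `brightness`, `particle_rate`) are hypothetical, chosen only for illustration.

```python
import numpy as np

def band_levels(samples, sample_rate,
                bands=((20, 250), (250, 2000), (2000, 8000))):
    """Return normalized energy per frequency band for one audio buffer.

    Inside TouchDesigner this would come from an Audio Spectrum CHOP;
    here it is sketched with a plain FFT. Band edges are illustrative.
    """
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    levels = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        levels.append(float(spectrum[mask].mean()) if mask.any() else 0.0)
    peak = max(levels) or 1.0
    return [lv / peak for lv in levels]  # normalize to 0..1

def visual_params(levels):
    """Map band levels to hypothetical visual parameters."""
    low, mid, high = levels
    return {
        "scale": 1.0 + low,            # bass pumps the geometry scale
        "brightness": mid,             # mids drive overall brightness
        "particle_rate": high * 100.0  # highs spawn more particles
    }

if __name__ == "__main__":
    # Feed a 110 Hz test tone (bass band) through the pipeline.
    sr = 44100
    t = np.arange(2048) / sr
    buf = np.sin(2 * np.pi * 110 * t)
    print(visual_params(band_levels(buf, sr)))
```

In a real setup the per-frame buffer would arrive from the mixer via the audio interface, and the resulting dictionary would be written to operator parameters each frame instead of printed.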