ORJIN.DEV
VISUAL TECHNOLOGY
06
S.S.O.R.
DOMAIN
OVERVIEW
WORKFLOW
REAL-TIME SYSTEM
OUTPUT
INTERACTIVE VISUAL SYSTEM
ROLE
RT VISUAL TECHNOLOGIST
TIME/PLACE
JUN 2024/LOS ANGELES
COLLABORATOR
ANA REYES CID
SOFTWARE
TOUCHDESIGNER/QLAB
She Sprang Out Red integrates real-time camera feedback visuals into a traditional theater setup. The project explores how live visual feedback can enhance performance spaces by synchronizing projected visuals with motion. The system was installed at Highways Theater in Santa Monica, CA, where it was used to augment live performances with time-coded projections.
As the Visual Technologist, I developed a time-coded motion-feedback projection system integrated into the existing theater infrastructure. The system captured live camera input and transformed it into dynamic visuals that interacted with the performers on stage. My role covered both designing the system and ensuring its smooth integration into the theater's setup.
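The sketch below illustrates the core feedback idea in plain Python/OpenCV rather than the show's actual toolchain: each camera frame's motion is folded into a decaying buffer, so performer movement leaves trailing, abstract shapes. The decay value, colormap, and window name are illustrative assumptions, not production settings.

# Illustrative sketch only: motion-driven feedback visuals with OpenCV,
# standing in for the production real-time network.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)            # live camera input
feedback = None                      # accumulated feedback buffer
prev_gray = None
DECAY = 0.92                         # how quickly old motion fades (assumed value)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if prev_gray is None:
        prev_gray = gray
        feedback = np.zeros_like(gray, dtype=np.float32)
        continue
    # Motion mask: difference between consecutive frames
    motion = cv2.absdiff(gray, prev_gray).astype(np.float32)
    prev_gray = gray
    # Fold motion into a decaying buffer -> trailing, abstract shapes
    feedback = feedback * DECAY + motion
    out = np.clip(feedback, 0, 255).astype(np.uint8)
    cv2.imshow("projection", cv2.applyColorMap(out, cv2.COLORMAP_INFERNO))
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()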
ROLE
TESTING
INITIAL FILTER

INTEGRATION
CAMERA FEEDBACK
RT SYSTEM
SIGNALS
QLAB TRIGGER
VISUAL FILTER
CONTENT
VISUAL PROCESSING
TIMECODE AUTOMATION
SITE CALIBRATION
OUTPUT
PROJECTION


INTEGRATION
Camera input
+
TD feedback system
+
final hardware specs
+
TD to QLab (OSC)
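As a hedged illustration of the TD-to-QLab handoff, the snippet below sends OSC using python-osc in place of TouchDesigner's own OSC output; the cue number is a placeholder, and QLab's default receive port (53000) and /cue/{number}/start address are assumed.

# Sketch of the TD -> QLab link over OSC (python-osc stands in for
# TouchDesigner's OSC output; the cue number is a placeholder).
from pythonosc.udp_client import SimpleUDPClient

QLAB_IP = "127.0.0.1"   # machine running QLab
QLAB_PORT = 53000       # QLab's default OSC receive port

client = SimpleUDPClient(QLAB_IP, QLAB_PORT)

def fire_cue(cue_number: str) -> None:
    # Start a QLab cue by number via its /cue/{number}/start address.
    client.send_message(f"/cue/{cue_number}/start", [])

# Example: when the feedback system reaches a show section,
# trigger the matching cue in QLab.
fire_cue("3.1")

In the installed system this message would fire from the feedback network's own event logic rather than a standalone script.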
SITE CALIBRATIONS


OUTPUT
LIVE MONITOR
The final show of SSOR integrates real-time motion feedback visuals into a 15-minute immersive theater performance, with the visuals appearing in the last five minutes. Three dancers, performing with fabric, have their movements captured and transformed into abstract projections on a 2D canvas, subtly reflecting their 3D reality. The visuals are synchronized with six time-coded intensity changes from the audio track, blending fluid motion, sharp contrasts, and layered abstractions. This dynamic interplay between live performance and projection enhances the spatial depth of the stage, creating a seamless fusion of physical and digital elements for an evocative finale.
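The six synchronized changes reduce to a timecode lookup: given the elapsed time in the visuals section, the system selects the intensity of the most recent cue. The sketch below shows that mapping with placeholder timestamps and levels, not the production cue sheet.

# Illustrative timecode automation: map elapsed show time to a visual
# intensity level. The six breakpoints are placeholders.
import bisect

# (seconds from the start of the visuals section, intensity 0..1)
INTENSITY_CUES = [
    (0.0, 0.10),
    (45.0, 0.25),
    (90.0, 0.45),
    (150.0, 0.65),
    (210.0, 0.85),
    (270.0, 1.00),
]

_times = [t for t, _ in INTENSITY_CUES]

def intensity_at(seconds: float) -> float:
    # Return the intensity of the most recent cue at or before `seconds`.
    i = bisect.bisect_right(_times, seconds) - 1
    return INTENSITY_CUES[max(i, 0)][1]

# e.g. feed this into the feedback filter's decay/brightness each frame
print(intensity_at(120.0))  # -> 0.45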
OUTPUT
SHOW CLIPS



