In our recent production of Jez Butterworth’s play Jerusalem, I took on the challenge of getting a live video stream from an on-stage ‘camcorder’ and mapping it back onto the set—and at reasonably low latency too.
The resulting system was reasonably simple and worked pretty well—with a little care and attention!
- iPod Touch running AirBeam
The iPod was taped onto the fold-out screen of a generic silver camcorder and blended in perfectly! AirBeam was worth every penny—reliable and with lots of options to configure the video stream. It outputs at
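AirBeam serves its video as a Motion JPEG stream over HTTP, which is just a series of JPEG images concatenated on the wire. As a sketch of what a consumer on the receiving end has to do (the URL and exact stream framing are assumptions — check AirBeam's settings screen for the real address), here's a minimal frame extractor that scans a buffered chunk of the stream for complete JPEGs:

```python
def extract_jpeg_frames(buffer: bytes) -> list[bytes]:
    """Pull complete JPEG frames out of an MJPEG byte stream.

    Each frame starts with the JPEG SOI marker (FF D8) and ends with
    the EOI marker (FF D9); anything between frames is ignored.
    """
    frames = []
    start = buffer.find(b"\xff\xd8")
    while start != -1:
        end = buffer.find(b"\xff\xd9", start)
        if end == -1:
            break  # incomplete frame at the tail; wait for more data
        frames.append(buffer[start:end + 2])
        start = buffer.find(b"\xff\xd8", end + 2)
    return frames
```

In practice you'd read chunks from the HTTP connection into a growing buffer and hand each extracted frame to whatever is displaying or relaying it.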
An old 802.11n router worked fine, though I had an 802.11ac router available just in case latency became an issue.
Taking the stream URL from AirBeam, this nifty utility outputs over Syphon, a library that allows applications to share video content. It was a little temperamental if the stream cut out, so Chris knocked together an app in Automator to quit and relaunch it in one go.
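The quit-and-relaunch trick could also be automated as a watchdog: restart the converter whenever no frame has arrived for a few seconds. This is a sketch under assumptions — the app name `AirBeamToSyphon` is hypothetical, and the macOS `osascript`/`open` commands stand in for whatever the Automator app did:

```python
import subprocess
import time

APP_NAME = "AirBeamToSyphon"  # hypothetical name for the converter utility


def stream_is_stale(last_frame_time: float, now: float, timeout: float = 5.0) -> bool:
    """True when no frame has arrived within `timeout` seconds."""
    return now - last_frame_time > timeout


def relaunch(app: str) -> None:
    # macOS: politely quit the app (ignore errors if it isn't running),
    # give it a moment to die, then launch it again.
    subprocess.run(["osascript", "-e", f'quit app "{app}"'], check=False)
    time.sleep(2)
    subprocess.run(["open", "-a", app], check=False)
```

A monitoring loop would update `last_frame_time` as frames arrive and call `relaunch(APP_NAME)` whenever `stream_is_stale(...)` returns true.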
The gold standard for cueing content in live productions. We used a video cue with its source set to Syphon, sent to a surface masked to the caravan’s front face.
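One nicety of QLab is that cues can also be fired remotely over OSC (by default on UDP port 53000), which is handy if you ever want the video cue triggered by something other than the operator. As a sketch — the cue number here is hypothetical — an OSC message is just a null-padded address string followed by a padded type-tag string:

```python
import socket


def osc_message(address: str) -> bytes:
    """Encode a no-argument OSC message (address + empty type-tag string)."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode("ascii")) + pad(b",")


def fire_cue(cue_number: str, host: str = "127.0.0.1", port: int = 53000) -> None:
    """Send QLab a start message for the given cue, e.g. /cue/1/start."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message(f"/cue/{cue_number}/start"), (host, port))
```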
We considered a few different options for the camera, though AirBeam’s ease of use is what decided things in the end.
- Raspberry Pi + Pi Camera with motionEyeOS
This was fully functional but rather bulky.
- GoPro HERO4
Getting a reliable stream was difficult, especially as the HERO4 departs from the behaviour of earlier models and requires a ‘keep-alive’ packet to be sent at regular intervals.
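For the curious, the keep-alive mechanism looks roughly like this. The address, port, payload, and interval below come from community reverse-engineering of the HERO4’s Wi-Fi protocol, not official GoPro documentation, so treat them as assumptions:

```python
import socket
import time

# Community-documented values for the HERO4 preview stream (assumptions)
GOPRO_IP = "10.5.5.9"                     # camera's default address on its own Wi-Fi
KEEPALIVE_PORT = 8554                     # UDP port the preview stream uses
KEEPALIVE_PAYLOAD = b"_GPHD_:0:0:2:0.000000\n"
INTERVAL = 2.5                            # seconds between packets


def keep_alive_forever() -> None:
    """Send the keep-alive packet on a timer so the preview stream stays up."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        sock.sendto(KEEPALIVE_PAYLOAD, (GOPRO_IP, KEEPALIVE_PORT))
        time.sleep(INTERVAL)
```

Miss a few packets and the camera simply stops streaming — which is exactly the failure mode that made it unsuitable for a live show.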
Side note: the production was lit with 6 × Chauvet Rogue R2 Washes, 2 × R2 Spots, 8 × Chauvet Par-Quad 18s plus a load of 650 W and 1 kW fresnels. I had lots of fun lighting the rave scenes!