Paul Pledger (BBC)

We were keen to explore multi-camera virtual production, comparing in-camera VFX with the established virtual studio workflow of chroma keying each camera over a graphics engine. This project involved making an episode of a BBC Persian film-review show called Aparat using an LED wall and floor (known as an LED volume) instead of the usual green-screen production method. Thanks to d&b solutions, we ran the pilot in the Immersive Technology Experience Centre at the London Science Museum.

Working with Sony and White Light, the BBC team adapted the existing Vizrt scene for Aparat to work in Unreal Engine for use on the LED volume. We were able to pre-visualize our shots in Unreal Engine thanks to a useful Sony plug-in that highlighted how likely an unwanted moiré effect was to appear on camera. This allowed us to finesse our camera and floor plans before the shoot, saving us time on set.
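As a rough illustration of why moiré depends so heavily on camera position, the sketch below estimates risk from simple geometry: how large one LED pixel appears on the camera sensor relative to the sensor's own pixel pitch. This is not how the Sony plug-in works internally; the function, thresholds and figures are illustrative assumptions only.

```python
def moire_risk(led_pitch_mm, distance_mm, focal_length_mm, sensor_pixel_mm):
    """Rough moire-risk band for one camera position (thin-lens approximation)."""
    # Size of one LED pixel as imaged on the sensor; magnification is roughly f/d.
    projected_pitch = led_pitch_mm * focal_length_mm / distance_mm
    ratio = projected_pitch / sensor_pixel_mm
    if ratio < 0.5:
        return f"low risk (ratio {ratio:.2f}): LED structure blurs below sensor resolution"
    if ratio < 3.0:
        return f"high risk (ratio {ratio:.2f}): LED pitch is near the sensor's sampling limit"
    return f"lower risk (ratio {ratio:.2f}): LED pixels are resolved as visible structure"

# Example: a 2.3 mm pitch wall shot from 10 m on a 35 mm lens with ~6 micron sensor pixels.
print(moire_risk(led_pitch_mm=2.3, distance_mm=10_000,
                 focal_length_mm=35, sensor_pixel_mm=0.006))
```

Moving the camera closer or changing the lens shifts that ratio, which is why being able to test camera and floor plans before the shoot saved time on set.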

Novel workflow

One of the challenges with multi-camera virtual production is ensuring each camera sees its corresponding perspective in the LED volume. We were testing a system that changed the background in the LED volume at exactly the same time as the camera cut. This is an alternative to other systems that interleave the backgrounds for all the cameras in the LED volume simultaneously, with each camera’s shutter speed adjusted so it only sees its intended background. Such systems are elegant, but limited to four cameras, and the people on set see all the backgrounds blended together, which can be distracting. Sony’s new workflow offers seamless real-time switching between cameras, regardless of how many are in use.
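A quick back-of-the-envelope sketch shows where the four-camera ceiling of interleaved systems comes from, and why cut-synchronised switching avoids it. The refresh and frame rates below are assumed for illustration, not figures from the pilot.

```python
# Interleaved approach: the wall's refresh rate is divided into time slots,
# one background per camera, and each camera's shutter is phased so it only
# integrates light during its own slot.
def interleave_slots(led_refresh_hz, camera_rate_hz):
    """How many distinct backgrounds an interleaved LED volume can carry."""
    return led_refresh_hz // camera_rate_hz

print(interleave_slots(200, 50))  # a 200 Hz wall feeding 50 Hz cameras -> 4 backgrounds

# Cut-synchronised approach (the one tested here): only the on-air camera's
# background is ever displayed, so the wall's refresh rate no longer caps the
# number of cameras.
def background_for_wall(on_air_camera, backgrounds):
    """Pick the single background to display, driven by the vision mixer's cut."""
    return backgrounds[on_air_camera]

print(background_for_wall("CAM2", {"CAM1": "wide shot scene",
                                   "CAM2": "close-up scene",
                                   "CAM3": "guest profile scene"}))
```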

Switching the background on the camera cut worked well for our pilots, but working this way means the camera operators can’t see the background for their angle when their camera isn’t on air. Sony overcomes this by interleaving a blue or green frame into the LED volume’s output and taking an alternate blue/green-screen feed from each camera (running at 100 Hz). Each camera’s blue/green frames are chroma keyed against its corresponding graphics engine, generating a preview feed for that camera’s viewfinder. Doing this results in the LED wall appearing slightly blue or green to the naked eye, but this didn’t seem to bother our presenter or guests.
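The keying itself is conventional chroma keying, just done per camera against that camera’s own rendered background. The sketch below shows the idea with toy data; the frame sizes, colour tolerance and compositing logic are assumptions for illustration, not details of Sony’s implementation.

```python
import numpy as np

def preview_composite(key_frame, engine_render, green=(0, 255, 0), tol=60):
    """Chroma key one green frame and fill the keyed area with this camera's
    rendered background, producing a preview frame for the viewfinder."""
    # In practice key_frame would be one of the alternating green frames pulled
    # from the camera's 100 Hz feed, and engine_render the matching frame from
    # that camera's own graphics engine.
    diff = np.abs(key_frame.astype(int) - np.array(green)).sum(axis=-1)
    matte = diff < tol                     # True wherever the LED wall's green shows
    out = key_frame.copy()
    out[matte] = engine_render[matte]      # replace green with this camera's scene
    return out

# Toy 2x2 RGB frames: one pixel of presenter, three pixels of green wall.
key_frame = np.array([[[200, 180, 170], [0, 255, 0]],
                      [[0, 255, 0],     [0, 255, 0]]], dtype=np.uint8)
render = np.full((2, 2, 3), (30, 60, 120), dtype=np.uint8)   # this camera's virtual set
print(preview_composite(key_frame, render))
```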

Enjoyable experience

The BBC team really enjoyed working with virtual production and everyone agreed the programme was enhanced by using this workflow. Our presenter and guests looked like they were part of the virtual scene, as opposed to being cut out and inserted into it. Instead of lighting for chroma key, our team could focus on lighting artistically to achieve the desired mood. A few of our studio lights spilled onto the LED wall, but this would be minimized with a slightly larger LED volume.

The presenter could see the editorial graphics and interact with down-the-line guests without having to imagine where they were in the virtual set. This resulted in a more natural-feeling show on camera. Everyone on set found the experience much more enjoyable than working in a green-screen studio.

After the pilot, we pushed the technology even further by experimenting with XR set extensions to expand our virtual scene beyond the LED volume, and we also looked at teleporting guests into the set from other locations. Colour matching and smooth camera tracking were challenging, but we achieved some interesting results.

It was great to see all the partners working together and supporting one another to further our collective understanding of virtual production. The Aparat pilot looked fantastic, and the week was a valuable learning experience for everyone involved!

 

This article was first published in the September 2024 issue of tech-i magazine.

 
