Virtual Production Terms You Need to Know

Josh Goossen
Visual Content Lead

In the ever-evolving world of virtual production, understanding the terminology is crucial. Whether you're an industry veteran or just diving into the virtual production realm, getting familiar with these key terms will help you navigate this exciting field.

To get you started, we've cherry-picked a few key definitions from the VP Glossary that we've seen thrown around both on set and in pre-production for ICVFX and XR productions.

In no particular order, let's get into it.

Genlock: Short for generator locking, genlock is the synchronization of cameras, LED processors, and render nodes to a shared reference signal so that every device starts each frame at the same instant. In virtual production, precise frame synchronization is essential for a seamless and realistic final output; without it, the camera can catch the LED wall mid-refresh, producing tearing or banding.
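As a back-of-the-envelope illustration of why a shared reference matters (a minimal sketch with assumed example numbers, not a measurement from any real rig), even a tiny mismatch between free-running clocks drifts a full frame out of sync surprisingly quickly:

# How quickly two free-running devices drift one full frame apart without genlock.
# The frame rate and clock error below are assumed example values.
frame_rate = 24.0                                          # frames per second
frame_period_ms = 1000.0 / frame_rate                      # ~41.7 ms per frame
clock_error_ppm = 50.0                                     # 50 parts-per-million mismatch between clocks
drift_ms_per_second = clock_error_ppm * 1e-6 * 1000.0      # 0.05 ms of drift accumulated every second
seconds_to_one_frame = frame_period_ms / drift_ms_per_second
print(f"One full frame of drift after roughly {seconds_to_one_frame / 60:.0f} minutes")  # ~14 minutes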

Volume: The space where the performance is captured. Also refers to a nearly enclosed LED stage in which a volume of light is emitted.

On-stage camera operator surrounded by an LED volume

Tracking: Tracking involves determining the precise position and orientation of the camera in physical space and then feeding that data into the virtual elements used in the production. This process ensures that the virtual and real worlds blend together seamlessly.

Tile: A tile is one of the square LED panels, typically grouped four to a cabinet, that lock together to form the LED wall, creating the canvas for virtual environments.

Pixel Pitch: The pixel pitch is the centre-to-centre distance between adjacent pixels on an LED wall. It's typically measured in millimetres and plays a crucial role in determining the wall's resolution and overall visual quality: the smaller the pitch, the more pixels fit into the same physical area.
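As a rough illustration (a minimal sketch using made-up wall dimensions and pitch, not the specs of any particular stage), the pixel count of a wall is simply its physical size divided by the pitch:

# Approximate resolution of an LED wall from its size and pixel pitch.
# All values below are hypothetical example numbers.
wall_width_m = 10.0       # physical wall width in metres
wall_height_m = 5.0       # physical wall height in metres
pixel_pitch_mm = 2.6      # centre-to-centre pixel spacing in millimetres

pixels_wide = round(wall_width_m * 1000 / pixel_pitch_mm)    # ~3846 pixels
pixels_high = round(wall_height_m * 1000 / pixel_pitch_mm)   # ~1923 pixels
print(f"Approximate wall resolution: {pixels_wide} x {pixels_high}")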

LED: LED, short for Light Emitting Diode, is the foundational technology behind the mesmerizing displays used in virtual production. LEDs are the lights that illuminate the virtual world and bring it to life.

Unreal Engine 5: Developed by Epic Games, Unreal Engine 5 (UE5) is a game engine that's not just for games. Studios like ours use it for the virtual elements of virtual production, connecting the real world with the virtual one. UE5 is a powerhouse of creativity and innovation.

Environment: In virtual production, the "environment" refers to the virtual space or level where the production unfolds. It's the canvas on which filmmakers work their magic.

Generative AI for VP: Generative AI is artificial intelligence capable of creating content based on prompts. In virtual production, it can generate environments or adapt existing ones based on specific requirements, opening up endless creative possibilities.

DMX: DMX, or Digital Multiplex, is the protocol used for live lighting control. In virtual production it allows lights, whether practical fixtures on set or virtual lights in the environment, to be adjusted dynamically and in sync, creating realistic scenes.
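Under the hood, a DMX universe is simply up to 512 channels, each carrying an 8-bit level from 0 to 255. Here's a minimal sketch of that data structure in Python (the channel assignments are hypothetical; a real fixture's channel map comes from its manual):

# A DMX512 universe: up to 512 channels, each an 8-bit level (0-255).
universe = bytearray(512)                # one byte per channel, all off to start

def set_channel(channel: int, value: int) -> None:
    """Set a 1-based DMX channel (1-512) to a level between 0 and 255."""
    if not 1 <= channel <= 512:
        raise ValueError("DMX channels run from 1 to 512")
    universe[channel - 1] = max(0, min(255, value))

# Hypothetical fixture patched at address 1: dimmer, then red/green/blue.
set_channel(1, 255)    # dimmer at full
set_channel(2, 200)    # red
set_channel(3, 120)    # green
set_channel(4, 40)     # blue

# A DMX interface (or an Art-Net/sACN node over the network) would then
# retransmit this universe continuously, up to roughly 44 times per second
# for a full 512-channel frame.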

Set Extension: The virtual continuation of the content, or background, shown on the LED wall. The set extension carries the environment beyond the physical edges of the wall, keeping the scene seamless whether or not the camera is pointed at the LED wall. This allows for wider shots without limiting the focal length of your lens.

PreVis: Previsualization, or "PreVis" for short, is the phase before production where filmmakers plan their shots on the LED wall. It's a critical time for technical solutions, calibration, and organization.

Real-Time Rendering: Real-time rendering is what powers extended reality (XR) experiences. The virtual background is processed frame by frame as the camera rolls, fast enough to be displayed on the wall and captured live, allowing for in-camera shooting of immersive scenes.

XR/AR: XR, or extended reality, is the umbrella term for technologies like mixed reality (MR), augmented reality (AR), and virtual reality (VR). AR adds digital elements to the real world, enhancing the user's perception.

Virtual Camera: A "virtual camera" behaves like a real-world camera within a real-time engine, such as UE5. It's used for shooting cinematic scenes or casting realistic backgrounds on LED walls. Virtual cameras can also be synced to tracked devices for lifelike movements.

Mo-Sys: Mo-Sys is a technology company specializing in motion tracking and automated camera movement. Their innovations are instrumental in achieving dynamic shots in virtual production.

For more information on camera tracking systems, check out our post The Secret to XR Virtual Production.

If you want to learn more about how to set up the Mo-Sys StarTracker system, check out our video here.

Brompton Technology: Brompton Technology is another technology company, this one focused on the processors that drive LED walls. These processors unify and calibrate the LED panels, ensuring a seamless, high-quality visual experience on virtual production sets.

We recently partnered with the team at Brompton to showcase their newest tool, TrueLight™, a new RGBW solution for subject lighting on immersive LED volumes; check out our review of it here.

MetaHuman: MetaHumans are remarkably lifelike digital humans created with Epic Games' MetaHuman framework for Unreal Engine. These digital beings can be fully controlled in virtual environments, adding a touch of humanity to the digital world.

ICVFX: ICVFX, or In-Camera Visual Effects, is the art of capturing visual effects live in camera, just as we do on an LED volume. It's a game-changer for achieving realistic effects in real time.

Frustum: The area of the virtual world that the camera sees. The importance of the frustum is that the render engine can focus its power on the area picked up by the camera, saving processing power instead of having to render the entire world at full quality.
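To make that concrete (a minimal sketch with made-up camera values, not how any particular engine implements its culling), here is a simple test for whether a point in camera space falls inside the viewing frustum; only objects that pass a test like this need to be rendered:

import math

def in_frustum(point, h_fov_deg=90.0, v_fov_deg=59.0, near=0.1, far=1000.0):
    # Camera-space convention assumed here: x right, y up, z forward.
    x, y, z = point
    if not near <= z <= far:                                 # in front of the camera, within draw distance
        return False
    half_width = z * math.tan(math.radians(h_fov_deg) / 2)   # half the frustum width at depth z
    half_height = z * math.tan(math.radians(v_fov_deg) / 2)  # half the frustum height at depth z
    return abs(x) <= half_width and abs(y) <= half_height

print(in_frustum((2.0, 1.0, 10.0)))   # True: inside the camera's view
print(in_frustum((2.0, 1.0, -5.0)))   # False: behind the camera, safe to skip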

If there are any aspects of virtual production you want to see explored further, whether here on the blog or in a YouTube video, reach out through our socials or shoot us an email; we'd love to hear your thoughts!

For more information about the terms outlined here, check out our YouTube video below!
