As far as ICVFX goes, LED walls, 3D models, live environments, and a dude in a reflective helmet are all topics that attract a lot of hype in virtual production. You know what doesn't get that hype? What needs more hype? Deserves it, even?
Camera tracking systems. Without these, this whole machine stops moving… well, it doesn't even start.
Without tracking systems, we're left with a 3D environment, a camera, and no link between the two: no fancy camera moves, no parallax, no in-camera VFX. This stuff matters! Our goal when picking the best camera tracking system for each unique use-case in virtual production is to track the camera in real space and relate that movement to a virtual camera in 3D space.
Camera tracking systems are the engine that makes XR virtual production (ICVFX) possible. Today we'll explain the two primary types of tracking systems: inside-out and outside-in. Both methods have their pros and cons, and both rely on infrared (IR) cameras and reflective surfaces used as tracking points.
But, what makes them so unique?
Put simply, a small sensor on top of your studio camera emits light that's invisible to the eye and measures how long it takes to bounce off one of the many tracking points positioned around your shooting area. This lets the tracking system determine the relative position of your camera and map out any changes in position or movement. The sensor sends this data directly to your real-time engine so that physical changes correspond to near real-time virtual changes.
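That "sensor to engine" handoff can be pictured as a simple callback loop: each new pose sample from the tracker is applied to the engine's virtual camera before the next frame renders. This is a minimal Python sketch, not any real engine's API; the `Pose`, `VirtualCamera`, and `on_tracking_sample` names are all hypothetical stand-ins.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Position (metres) and rotation (degrees) of the tracked studio camera."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float

class VirtualCamera:
    """Stand-in for the camera object inside a real-time engine (hypothetical)."""
    def __init__(self):
        self.pose = Pose(0, 0, 0, 0, 0, 0)

    def apply(self, pose: Pose):
        # In a real engine this would update the render camera's transform
        # for the next frame, so physical moves show up virtually.
        self.pose = pose

def on_tracking_sample(camera: VirtualCamera, sample: Pose):
    # Called every time the tracking sensor delivers a fresh pose.
    camera.apply(sample)

cam = VirtualCamera()
on_tracking_sample(cam, Pose(1.2, 1.5, 0.3, 45.0, -5.0, 0.0))
print(cam.pose.x)  # 1.2
```

The key idea is just that tracking data flows one way, sensor to engine, once per sample, which is why any delay in that pipe shows up on screen as latency.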
This system definitely has its advantages when compared to other systems.
Most notably, and of concern for many filmmakers, especially smaller-budget productions, this system is typically less expensive than similar options because it requires less hardware. Tracking points are placed on the floor, ceiling, or walls of your studio and don't require additional infrastructure; they also don't need to sit at equal distances from the centre of your stage. This makes installation easy and quickly adjustable depending on the needs of your production.
This also makes your setup more flexible: the stage is movable, the hardware lives at the camera level, and there's little fixed infrastructure holding you back. These conveniences do come at a small price, though: increased latency, a larger margin for error, and limitations in use-case. Because a single sensor is responsible for identifying multiple tracking points and calculating the camera's relative position, quicker camera movements can produce discrepancies in the positional data when the processing time and data capture can't match the complexity of the movement.
That being said, thanks to the flexibility of the tracking points, many of these issues can be avoided with some smart adjustments to your studio layout.
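One common software-side fix for noisy positional data is to smooth each axis with an exponential moving average before it reaches the engine: glitchy samples get damped, at the cost of a little extra lag. This is a minimal sketch of that trade-off, not a recommendation from any specific tracking vendor; the sample values are made up.

```python
def smooth(prev, new, alpha=0.3):
    """Exponential moving average: lower alpha = smoother but laggier."""
    return prev + alpha * (new - prev)

# One axis of (hypothetical) camera position samples; the 5.0 is a tracking glitch.
samples = [0.0, 1.0, 0.9, 1.1, 5.0, 1.0]

filtered = []
value = samples[0]
for s in samples[1:]:
    value = smooth(value, s)
    filtered.append(round(value, 3))

print(filtered)
```

Notice the glitch still nudges the output, but by far less than its raw magnitude; tightening `alpha` damps it further while adding latency, which is exactly the balance an inside-out setup has to strike.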
In contrast, outside-in motion tracking systems set up multiple sensors or cameras around your defined stage area, sweeping the studio with imperceptible light and identifying defined tracked objects as they move within the space. These systems typically centralize the combined information and feed it into a real-time engine to drive live updates.
This is a more rigid, semi-permanent solution that offers increased reliability and power.
Multiple cameras feeding into a stronger CPU means 3D calculations are less of a bottleneck, minimizing latency. The setup can identify a single tracking point with a scalable number of sensor cameras, so positional data gets more accurate as additional cameras see the point and provide new information. It also lets the stage serve multiple purposes, since other tracking use-cases, such as character motion capture, benefit from a similar setup.
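The "more cameras, more accuracy" idea boils down to triangulation: each sensor contributes a ray toward the marker, and the system solves for the point closest to all of them. Here's a minimal least-squares sketch of that geometry, assuming ideal unit-direction rays; the sensor positions are invented for illustration and this isn't any vendor's actual solver.

```python
import numpy as np

def triangulate(origins, directions):
    """Least-squares intersection of rays from several tracking cameras.

    Each ray has an origin o_i (the sensor) and a direction d_i pointing
    at the marker. We solve for the point minimizing the summed squared
    distance to all rays.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projects onto the plane perpendicular to d
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Three (hypothetical) wall-mounted sensors all looking at a marker at (1, 2, 0.5).
marker = np.array([1.0, 2.0, 0.5])
origins = [np.array([0.0, 0.0, 3.0]),
           np.array([4.0, 0.0, 3.0]),
           np.array([2.0, 5.0, 3.0])]
directions = [marker - o for o in origins]

print(np.round(triangulate(origins, directions), 3))  # recovers the marker position
```

With real, noisy sensors the rays don't intersect exactly, and that's where extra cameras pay off: each additional ray adds another constraint to the least-squares solve, pulling the estimate closer to the true marker position.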
This improved performance is weighed down by expensive upfront costs, additional rigging and support scaffolding, and studio-size constraints. Individual sensor cameras are pricey, and if your studio is larger than a sensor's range, you'll need more sensors to get results comparable to inside-out systems. The system also has to be factored in when the stage is designed: the cameras need a mounting structure and must be positioned so they don't interfere with stage operation while still keeping an unobstructed view of tracked objects on the stage.
With enough preparation and resources, this system is definitely viable for large-scale production, and its improved performance has been favoured by many of the big productions using this technology. OptiTrack has been an industry leader in camera tracking with this setup, while smaller, more indie alternatives such as the HTC Vive Pro system adopt it too.
Josh Goossen + Beth Mutch