Blog | Boris FX

Driving Virtual Production with SynthEyes 3D Tracking

Written by Jessie Electa Petrov | Sep 30, 2025 3:10:21 PM

Alex Pearce is a creative technologist, innovative filmmaker, and leader in the XR movement. He was the first real user of Assimilate Live FX and is one of the top Live FX operators. He’s also a virtual production supervisor, writes the Creative Technology for Film newsletter, and recently founded Sim-Plates, a library of CG video plates rendered with the Octane Render engine as 16k, fully path-traced 360 videos for virtual production car-process workflows.

Pearce has been a SynthEyes user since 2015, when he was first introduced to the advanced 3D camera solve toolset while working as a cinematographer/director at Jaunt VR. “One of our vendors used SynthEyes to do a full 3D solve and then stabilize/add 3D models to a shot — and it blew my mind,” says Pearce. “That was when I really started getting into 3D and VFX in general.” 

Alex Pearce shares virtual production and car plate insights

Although he had pitched virtual production a few times, his leap didn’t fully materialize until COVID hit while he was working for a company producing VR training for United Airlines. The pandemic necessitated innovative techniques to deliver projects, and the team relied on real-time green screen virtual production to safely shoot a scene in which actors sit around a table talking to each other. Shortly after, he joined Light Sail VR to build virtual production pipelines. He has since worked on virtual production for music videos, interviews, tradeshow demos, and, more recently, TV series and the upcoming feature film Lear Rex, starring Al Pacino, Jessica Chastain, and Peter Dinklage.

360 Driving Scenes

Pearce specializes in car-process workflows, which he believes are an often-overlooked element in virtual production. Driving footage needs to be as stable as possible to blend seamlessly into a project’s world. If it’s not stable, the content moves even when the car is not moving.

“If you think about it, that’s like if the world was moving and shaking around you instead of the car shaking and moving. This is what makes it look like bad rear projection,” states Pearce. “I still see it on TV all the time. Even if the company providing the plates has stabilized their footage, you should check the footage to see if you can further stabilize it, regardless of whether it’s 2D or 360. Your monitor may look ok, but once you project it on a massive LED wall, every little shake is very obvious.”

“Tracking a driving scene in full 360 is extremely challenging. I’ve tried all the major tracking solutions. SynthEyes is the best/fastest out there,” Pearce notes. “SynthEyes is a very, very deep program that can do many different things, but you’ll likely need it for only a few specific tasks. It’s best to watch tutorials and find what you need, and then make your own documentation, because, likely, after you write it out, you’ll realize it’s actually quite simple, just a few steps. Still, it can look intimidating seeing all the options.”

Virtual production set

A Workflow for the Future

Pearce teamed up with SynthEyes and Assimilate product teams to help develop a workflow that allows virtual productions to solve potential issues like stabilizing shaky driving plates much faster.

The 360 track is completed using SynthEyes' speedy 3D tracking methods, but instead of sending the data to applications like Blender or Nuke first, it gets sent directly to Assimilate Live FX via a new feature introduced in SynthEyes 2025. The Assimilate 360VR Stabilization exporter eliminates multiple steps, saving a few hours per shot. This new exporter streamlines the workflow by sending yaw, pitch, and roll data directly to Assimilate Live FX, reducing render times and enabling real-time adjustments on both 360 and multi-camera projects.
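Conceptually, stabilizing a 360 plate from per-frame yaw, pitch, and roll data means counter-rotating each equirectangular frame by the solved camera rotation. The NumPy sketch below illustrates that idea only; the rotation order, function names, and nearest-neighbor sampling are assumptions for illustration, not the actual SynthEyes export format or the Live FX implementation.

```python
import numpy as np

def rot_matrix(yaw, pitch, roll):
    """Compose yaw (Y axis), pitch (X axis), and roll (Z axis) rotations.
    Angles in radians; the order here is an assumption for illustration."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    return Ry @ Rx @ Rz

def stabilize_frame(frame, yaw, pitch, roll):
    """Counter-rotate one equirectangular frame by the solved camera rotation."""
    h, w = frame.shape[:2]
    # Pixel grid -> spherical angles (longitude/latitude).
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    lon = (u / w) * 2 * np.pi - np.pi
    lat = np.pi / 2 - (v / h) * np.pi
    # Angles -> unit direction vectors on the viewing sphere.
    d = np.stack([np.cos(lat) * np.sin(lon),
                  np.sin(lat),
                  np.cos(lat) * np.cos(lon)], axis=-1)
    # Right-multiplying row vectors by R applies R^T, i.e. the inverse rotation.
    d = d @ rot_matrix(yaw, pitch, roll)
    # Directions back -> source pixel coordinates, then nearest-neighbor sample.
    src_lon = np.arctan2(d[..., 0], d[..., 2])
    src_lat = np.arcsin(np.clip(d[..., 1], -1.0, 1.0))
    su = np.round((src_lon + np.pi) / (2 * np.pi) * w).astype(int) % w
    sv = np.round((np.pi / 2 - src_lat) / np.pi * h).astype(int) % h
    return frame[sv, su]
```

With zero angles the remap is an identity, which is a quick sanity check that the pixel-to-sphere-to-pixel round trip is wired up correctly.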

“Doing a full 360 track in SynthEyes and then exporting straight to Assimilate for playback on LED walls has been very cool,” says Pearce. “We normally use really high-resolution media, so the workflow looks something like this.”

  • Import 16384 x 8192 EXR sequence into Live FX
  • Transcode to 16k NotchLC for playback (if it wasn’t delivered this way)
  • QC the NotchLC file
  • Transcode a 4k EXR sequence specifically for SynthEyes
  • Track in SynthEyes
  • Export using SynthEyes Assimilate 360VR Stabilization Exporter
  • Import into Assimilate Live FX
  • Ready to play! 
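The proxy step above (transcoding a 4k EXR sequence for SynthEyes) is the kind of thing that can be scripted. The sketch below builds an FFmpeg command line for a 16k-to-4k EXR downscale; the paths, pattern names, and the assumption of an FFmpeg build with EXR encode support are illustrative, not a vetted recipe from Pearce’s pipeline.

```python
def proxy_command(in_pattern, out_pattern, width=4096, height=2048, fps=24):
    """Build an ffmpeg command that downscales an EXR sequence to a 4k
    tracking proxy. Flags and defaults are assumptions for illustration."""
    return [
        "ffmpeg",
        "-framerate", str(fps),
        "-i", in_pattern,                  # e.g. "plates/drive_%06d.exr"
        "-vf", f"scale={width}:{height}",  # keep the 2:1 equirectangular aspect
        "-c:v", "exr",                     # requires an FFmpeg build with EXR encoding
        out_pattern,                       # e.g. "proxy/drive_%06d.exr"
    ]

cmd = proxy_command("plates/drive_%06d.exr", "proxy/drive_%06d.exr")
print(" ".join(cmd))
```

Generating the command as a list (rather than a shell string) keeps it safe to pass to `subprocess.run` without quoting issues.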

In addition to using SynthEyes to track and stabilize car plates, Pearce plans to add 3D objects to scenes soon using the same workflow. “This will be a very good way of enhancing traditionally captured plates,” comments Pearce. “You’re getting the best of both worlds — photoreal sky and environment with some customization like futuristic elements.”

Experimenting with virtual production

The Emerging Role of VP Supervisor

With his breadth of experience, Pearce has been tapped as a virtual production supervisor on many projects and uses the opportunity to educate those around him. The role requires technical and creative know-how spanning the stage technology (LED wall, LED processors, media server, and content) as well as the basics of color management, cinematography, video engineering, and IT infrastructure. A VP supervisor also needs to communicate effectively, or in some cases translate, between the filmmakers on set and the technicians and operators at the stage.

Pearce feels strongly that TV shows need virtual production supervisors for the same reason they need a showrunner or digital imaging technician. “If you’re working on a television series, you might shoot at different stages or times,” remarks Pearce. “The look should be consistent from show to show. Furthermore, the stage virtual production supervisor may do things one way, but the show may have a different requirement.”

“I’ll give you one good example of why this is important. Netflix has put out a few standards for virtual production. One of them is that the wall/content should be HDR (High Dynamic Range),” continues Pearce. “Even if a show is not for Netflix, I highly encourage everyone to follow these standards. They are great guidelines.” 

“Many times recently, I’ve been hired as a VP supervisor on stages that state they use a specific workflow (that was not HDR). Since I understand how to set it all up properly, I was able to work with the stage and get them set up to do HDR properly,” adds Pearce. “If I hadn’t been there, they would’ve shot in SDR (Standard Dynamic Range), and besides not looking as good, it wouldn’t have been up to the standards.”

Sim-Plates Examples

“I try not to have any industry secrets,” ends Pearce. “I understand why many people and companies want to keep this knowledge to themselves, but in general, I don’t think it will negatively impact me much and might save someone else weeks of troubleshooting if I can just share things I’ve already figured out. I am a big believer in the ‘rising tides lift all boats’ theory.”