Oscars 2021: VFX on Judas And The Black Messiah
The team at Zoic Studios discusses what it was like to tackle invisible VFX on the Best Picture nominee and the importance of Boris FX plugins.
We recently caught up with Nate Overstrom, Zoic Studios' Creative Director/VFX Supervisor, to chat about the team’s visual effects work on Judas And The Black Messiah. The film was nominated for six Academy Awards including Best Picture and nabbed Oscars for Daniel Kaluuya (Best Supporting Actor) and H.E.R. (Best Original Song), among its 36 wins overall during this year’s awards season.
You’ve worked as a Creative Director/VFX Supervisor on high-profile projects like Homeland, The Man in the High Castle, Godfather of Harlem, and more. What led you to a career in visual effects? How did you get your start?
I’ve been an artist for as long as I can remember, drawing and painting since I was little. Growing up with incredible sci-fi movies, video games, and comic books in the 80s and 90s, it was never a question that I would do something where I would create characters and worlds. 3D CGI seemed to be the place where all that funneled together so I went to school for 3D animation. I ended up interning at a VFX studio after college, which turned into freelance and then a staff position. I migrated to compositing early on and never really looked back, though my 3D background really augmented my capabilities as a 2D artist. I really liked being able to put everything together and create the final shot. VFX is a unique place where sciences like math, physics, robotics, mechanics, anatomy, and biology blend together with photography, composition, drawing, painting, and general aesthetics. It is our job to distill all this to create compelling images. I can’t imagine a more fascinating job to come to every day.
You use Boris FX products in your pipeline. How important is it for you and your VFX partners to have access to these tools in your daily workflow?
Sapphire is a vast suite of plugins that just keeps getting better and better. It makes it easy to layer and build complicated-looking effects. Mocha Pro has been a staple tool for artists forever. Often Mocha is the only program able to track certain shots where full 3D solves fall apart and traditional trackers have nothing to grab onto. We’re looking forward to exploring our capabilities in Silhouette with Mocha integration!
At what point did you enter the pipeline on Judas And The Black Messiah? What was the collaboration process like?
We entered the pipeline as a vendor early on. We worked with Jeremy Newmark (the VFX Supervisor for the client) from the initial script bid to production consultations all the way through final VFX reviews. We always view VFX primarily through the lens of storytelling. What is the shot about? What is the story of the scene? Sometimes the success of an effect is about subtlety. Jeremy was a great collaborative partner and let us lead with a lot of the creative ideas we had.
Viewers of the film might not necessarily think of it as a VFX-heavy film. How many and what kind of shots did you handle? What was the biggest VFX challenge?
We did about 75 shots for the film. We worked on a wide range of shots, from cleanup and period fixes to blood hits and muzzle flashes, explosion enhancement, rain effects, and debris effects from the shootouts. Our biggest VFX sequence was the police shootout at the Black Panther Headquarters. The facade of the building went through a progressive change in damage across the scene and was viewed from multiple angles. Knowing that continuity would be the biggest challenge, we laid out the whole scene in 3D and built a color-coded “hit” diagram to map the progression of damage. We balanced hits that happened on-screen with ones that happened off-screen, so when we got to wider shots the whole facade was equally damaged. We drove the majority of the debris and smoke effects with Nuke’s particle system tied to reveal mattes on the damaged matte paintings. The idea was that we could address timing and placement changes and the additional effects would update on the fly. After the timing was signed off, we went in and addressed minor issues frame by frame.
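The continuity bookkeeping described above can be sketched as a simple cumulative damage tracker. This is a hypothetical illustration only (the hit names and shot ordering are invented, not from the production): each hit on the facade, on-screen or off, is logged against the shot where it lands, so any later shot can query the total damage that should be visible at that point.

```python
# Hypothetical sketch of a color-coded "hit" diagram as data:
# every hit accumulates, whether or not it happened on camera,
# so wider shots downstream show the full damage state.

hits = [
    # (shot_order, hit_id, on_screen)
    (1, "window_left", True),
    (1, "door_frame", False),   # off-screen hit, still accumulates
    (2, "brick_upper", True),
    (3, "window_right", False),
]

def damage_state(shot_order):
    """Return all hits accumulated up to and including this shot."""
    return sorted(hit for order, hit, _ in hits if order <= shot_order)

print(damage_state(2))  # ['brick_upper', 'door_frame', 'window_left']
```

The key design point is that off-screen hits live in the same list as on-screen ones, which is what keeps the facade equally damaged when the camera finally pulls wide.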
Where do you see the VFX industry going in the next 10 years? What new technologies are you most excited about?
Both AI and real-time graphics are going to be big tools for projects. Not every show has millions of dollars for big-budget scenes. A lot of shows have to choose between doing cool shots they WANT to do and doing shots they HAVE to do. If AI can help bring down the cost of laborious tasks, I think it is going to open the door for more creative work. The improvement of real-time graphics goes beyond the implementation we’ve all seen on The Mandalorian. Bringing Pre-Vis, Tech-Vis, and Post-Vis tasks into real-time gives directors and producers more tools to make better decisions, either before shooting or before diving into larger VFX shots. Rendering final-pixel quality images in seconds or minutes (or real-time) instead of hours will allow for more iterations, with changes made in 3D instead of in compositing. With more content being created than ever before, we will need ever more powerful tools to work smart and efficiently and keep up with increased demand.