Tanawat “Tor” Wattanachina has worked in visual effects since graduating in 2004. The Bangkok-based artist started as a compositor before spending ten years as a production generalist and team lead at a matchmove, rotomation, and VFX prep service for international clients. Now, he works primarily on local content in Thailand, using SynthEyes for all his matchmove needs.
His work includes (credited and uncredited) Land of Tanabata (Disney+, 2024), Suicide Squad (2016), Edge of Tomorrow (2014), 300: Rise of an Empire (2014), The Amazing Spider-Man 1 & 2, and Ghost Rider: Spirit of Vengeance (2011).
What type of VFX tasks do you typically tackle?
People come to me for all sorts of matchmove work and problems they want to solve. Roughly 80% of what I do is camera and object tracking, and the other 20% is deformation or rotomation work.
Nowadays, I tackle a wider variety of tasks beyond matchmove, such as on-set VFX supervision, VFX data wrangling, and team management.
Can you break down your process as to how you approach a shot?
I don’t think my process is any more special than anybody else’s. I prefer to think of a shot as my new puzzle toy. You just have to figure out how best to approach it by looking at it from as many angles as time allows and coming up with a strategy. The first one might work right out of the gate, or it might not. That’s alright. You move on to the next strategy. That keeps me focused (and having fun) while solving the puzzle in front of me.
How did you first discover SynthEyes? How long have you been using it?
It was around 2017, before I left my previous company. I was doing matchmove work using in-house software, and I also worked with artists and programmers to improve that software, so I knew how it worked on both the front end and the back end. It could do very sophisticated and flexible setups for matchmove work. When I left, I needed something equally excellent that would allow me to continue my journey as a matchmove artist.
I went on the hunt and evaluated the “industry standard” matchmove software. I didn’t quite like it; it was clunky (in my opinion). Then I found out about SynthEyes. Its speed is unbeatable, it’s well-rounded, and it can get you to the finish line with a very fast turnaround. I’ve used it ever since.
How has having access to SynthEyes changed the way you work?
As I mentioned before, I had been using in-house software to tackle matchmove work for about a decade. It was flexible and sophisticated, BUT it was slow: it could easily take minutes to solve a typical shot. With SynthEyes, it takes the blink of an eye! I’ve never solved shots faster in my entire career. I can push out shots in a day like never before.
What SynthEyes features do you find most valuable — and why?
I would say “seed path” and “axis locks.” They are easy to understand and use. A typical matchmove artist relies solely on a raw solution; most of the time, I use both features to “craft” the solve so it works well for the artists down the production line.
What is the most challenging shot you’ve ever tracked with SynthEyes?
That’s a hard question to answer after doing this for more than ten years.
Like I said before, a shot is a puzzle. I know what kind of shot or footage is hard to tackle in matchmove software alone, and when it requires hand-animated camera work that is better handled in a DCC app. You know when you hit a wall and need to walk around it.
What are your top 3 reasons matchmove artists should incorporate SynthEyes into their pipeline?
It gets the job done fast. It’s capable of handling most things you’ll encounter in your matchmove career. And it is VERY affordable.
What’s your best pro tip for artists new to SynthEyes?
Know the basics. Learn why a solve works and why it doesn’t. Do a lot of exercise shots. Matchmove is like a math problem: you need a lot of experience handling this kind of task, and the more you do, the better you’ll tackle it later in your career.
What’s your favorite project you’ve ever worked on — and why?
My favorite project is Tron: Legacy. It was a time when my team and I didn’t know how to tackle a shot. It was a full-body actor matchmove, and back then, we only did camera and rigid-object tracking.
One day, both our team and the client agreed it would not work that way. We decided to animate the full-body CG character to match the actor, using a camera and rigid-object track as the base for our animator to do the full-body matchmove. It worked excellently! That’s when we all had our AHA moment: there’s no one-size-fits-all solution. It’s about what works best for a particular problem.
That’s how we built our rotomation department back in 2010, and we went on to provide that service for many other films, including Green Lantern, Ghost Rider, Men in Black III, and The Amazing Spider-Man 1 & 2. We continue the service to this day.