S4 Studios is a one-stop shop for virtual production and visual effects, with a studio in Los Angeles and a virtual production/ICVFX (in-camera visual effects) stage up the road in Canoga Park. Owner and creative director Geoffrey Kater is using the AJA GEN10 HD/SD/AES sync generator with the HTC VIVE Mars CamTrack, and we recently caught up with him to talk about how his studio is using the technology to keep its LED walls, cameras, and computers in sync.
S4 Studios is a visual effects company with an ICVFX stage. We offer one location where clients can shoot in real time, and because we’re a full-service visual effects studio, we can also do clean-up and create visuals they might not be able to achieve on the wall. S4 Studios was started 24 years ago with a focus on animation and VFX. About two years ago, I began getting involved in virtual production and blending in real-time visual effects. We set up three walls and all kinds of virtual production equipment. At that time, Unreal Engine was taking hold. Because I’d worked in animation and VFX for so many years, I adopted the software and started using it in our pipeline, not just for virtual production, but also for other types of animation and VFX work. When COVID hit, everybody started to work virtually, and we started working more with Unreal.
We wanted to do more virtual production and started looking for a stage. I came across the stages at Remmet Studios, where they had just started building an LED wall, and offered up my production team to develop the technology to run those walls and build on this new era of filmmaking, in exchange for allowing us to use the space as a home base.
When shooting, my role is creative director and virtual production supervisor. I work with the director and the DP to ensure they get the shots they want for the day. I also run the Unreal group that puts the environments on the wall; I direct them to change things around or move setups based on the type of shot we’re trying to do.
We have a smaller stage; it’s 2,300 square feet, with a three-wall setup, which is great for vehicle process shots. We've done music videos and other shoots at a significantly lower price than the giant ICVFX volumes. We want more companies to understand that virtual production is within reach, even for smaller projects. You don't have to spend $50K to $60K a day on a volume. We start at $10K a day, and you save money in the long run because you can shoot multiple locations without having to go anywhere.
Cost-efficiency is a big part of it, but the real-time art direction and ‘final pixel’ aspect is driving the trend. You come in, shoot, and leave with the final pixels. It’s about having 100% control of what’s on the wall in that virtual world to ensure you’re getting your desired look. You can also pull off techniques you couldn’t in the real world because you have complete control over the set. You can't shoot on the beach and make the sun go up and down at will. A lot of producers who are becoming experts on the technique also understand the cost savings because there’s only a little VFX or post work needed afterward. It also eliminates the need to travel to a location, get permits, and block off city streets. You can shoot multiple locations in a single day without a company move.
A typical day is centered around testing different cameras, systems, and environments, along with creating our library of Unreal environments and marketing the business. We also team up with technology providers to showcase gear to potential clients. I also constantly work with line producers, invite them to come in and experience the tech or have a DP stop by to see how the tracking works and wrap their brain around the types of shots they can do on this stage.
The biggest challenge is that some clients have never shot on an LED wall, so getting them to understand shooting distance to the wall and pixel resolution is key. There can also be issues like moiré, along with camera settings you don't have to consider on a regular shoot, like shutter angle and genlock. Some folks come in with cameras that don't even have a genlock sync input.
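The shutter-angle consideration mentioned above comes down to simple arithmetic: a rotary shutter angle and a frame rate together determine how long each frame is exposed, which in turn affects how the camera interacts with the LED wall's refresh. As a rough back-of-the-envelope sketch (this helper function is our own illustration, not something from the interview):

```python
def exposure_time(shutter_angle_deg: float, fps: float) -> float:
    """Exposure time in seconds for a given rotary shutter angle and frame rate.

    The shutter is open for (angle / 360) of each frame period (1 / fps).
    """
    return (shutter_angle_deg / 360.0) / fps

# The classic 180-degree shutter at 24 fps exposes each frame for 1/48 s.
print(exposure_time(180, 24))  # ~0.0208 s, i.e. 1/48
```

Exposure times that don't line up cleanly with the wall's refresh cycle are one source of the banding and flicker artifacts crews have to dial out on an ICVFX stage.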
The virtual production crew is also a very different component that’s now part of your shoot. We run into technical challenges throughout the day, so there has to be an understanding that we will work together tightly to get them the shots they want. We also ask clients to be patient with the fact that we might have a computer that starts acting funky or a wall that glitches. So, we have a wall tech who focuses specifically on the walls, the processing power, and those types of things. Another component is revealing to a client that we can move a building in the background and/or slightly change the lighting. It takes some time for them to wrap their heads around being able to use this as a creative tool in real-time and not just as a backdrop.
All of our computers have NVIDIA Quadro cards and sync cards in them so that, down the chain, they can all be synced together on the wall. Then, there are the actual LED processors, which drive the media onto the wall. Next is the tracking aspect, where the VIVE Mars camera tracking comes in. We hung the base stations up in a grid of about ten by ten meters, so it works perfectly for the size of our stage and the amount of shooting we do.
Once that was done, we started testing and noticed tearing in the image because there was no genlock, no sync. This issue came up frequently on the Mars Facebook Group. People started posting diagrams about how it worked, and there was a mention that the AJA GEN10 worked well with the Mars system, a sort of ‘plug it in and forget it’ solution. The whole genlock thing was new to me, coming from visual effects, but I ordered it, and when it showed up, we configured the little DIP switches on the back and plugged everything in. The tearing and the lines were gone, and the walls, camera, computers, and Mars system were all synced at 24 fps; it was beautiful. We were running around high-fiving each other.
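The tearing described here is what happens when devices free-run on their own internal clocks instead of locking to a common reference. Even tiny clock-rate mismatches accumulate surprisingly fast. As a rough illustration of why (our own numbers, assuming a hypothetical parts-per-million clock offset, not figures from the interview):

```python
def seconds_to_drift_one_frame(fps: float, clock_offset_ppm: float) -> float:
    """Time for two free-running devices to slip a full frame apart.

    clock_offset_ppm is the mismatch between their clock rates in parts
    per million; the slip accumulates at that rate, in seconds per second.
    """
    frame_period = 1.0 / fps
    drift_per_second = clock_offset_ppm * 1e-6
    return frame_period / drift_per_second

# At 24 fps, a modest 50 ppm mismatch slips a whole frame in roughly 14 minutes.
print(seconds_to_drift_one_frame(24, 50) / 60)  # ~13.9 minutes
```

Genlock sidesteps this entirely: every device in the chain slaves its timing to the same reference signal, so the slip never accumulates in the first place.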
We had found our solution with the GEN10, or the magic box as we call it. It’s the metronome, the heartbeat of the operation, because it brings everybody together on the same page. That little magic box is the conductor of it all. While not everyone opts to use genlock, I like knowing we are covered and that all of our computer equipment, walls, and everything else are in sync.
Being on the HTC VIVE Mars CamTrack Facebook Group is the best support we could ask for, and Tim Wen over there is always supportive. Their technology is also set-and-forget, so there is little to call tech support about. Everything's well documented, and they’re continuing to release features that make our life easier. On the AJA side, when we ran into an issue, our video engineer said, ‘Call AJA, and I guarantee someone will pick up the phone.’ I got someone right away, no hold music. We talked them through our challenge, and they began looking into it. AJA is the gold standard, that's for sure.
AI is the biggest game changer right now. Rotoscoping and tracking are now a click of a button with AI. Having started my career as an animator and designer, then moved into computer animation, character animation, and visual effects, I've been through different paradigm shifts. The AI wave we’re experiencing now isn’t dissimilar to the move from 2D to 3D animation. Everyone thought 2D would go away, but there are still a lot of great 2D cartoons.
That’s how I look at AI movie-making tools. Just because a computer can be trained to make a beautiful background doesn't mean it has the cinematic eye. It's another tool that can help us do our job better. It affects turnaround and speeds up the time it takes to get something done that could be difficult. In the hands of someone like me, who knows how to track and roto and has had to do all that rigging, it can help get me where I need to go to do this job much faster.
I also see a lot of examples of what’s being done with AI to replace a person with a character and how great it looks. You can tell that it was shot on a really cool jib that a whole crew put together, not some guy on an iPhone. This is where the intersection of AI tech and physical production are going to come together into what we're working on. It’s exciting but also frightening to some. In the right hands, especially for creatives who understand where these AI tools came from, it's a huge boon for our industry.