On the eve of SIGGRAPH, Allegorithmic announced its Substance Automation Toolkit, a set of Python-driven APIs that lets facilities automate their texturing workflows from the command line. It's a sign of the times: Allegorithmic is evolving its Substance line of 3D texture-creation software to meet the needs of filmmakers who, it says, are increasingly adopting real-time production techniques originally pioneered in the game industry. Those techniques are proving especially handy for previs, VFX and animation, and VR production.

“It’s a two-way trend,” Allegorithmic EVP Alexis Khouri told StudioDaily. “Our tools are becoming more and more geared toward film and VFX, and on the other end, the film industry is moving toward more real-time tools. I believe we’re going to meet halfway.” Some of the projects that have used Substance include the feature films Logan and Assassin’s Creed and the Amazon series The Man in the High Castle.

Khouri noted that real-time has become a bigger part of film pipelines, especially with a new breed of children’s cartoons that are rendered in real time using a game engine. At the same time, as VFX houses gear up to take on VR projects, they’re hiring new personnel with experience in the games industry who can use their existing skills on relatively low-budget VR experiences. And feature filmmakers are relying more and more on real-time previs, especially as the quality of real-time renders edges closer and closer to the look of the finished product.

“We are adding features step by step, especially in Substance Painter, to deal with this,” said Khouri, citing continuing work to incorporate features like LUTs and OpenColorIO support. Another priority is the ability to work on very large assets, a feature-film requirement, using techniques like UDIM that are largely unknown in videogame pipelines. “At SIGGRAPH, we are presenting steps in this direction, such as the ability to have a much lower loading time when switching from one material to another, and also the ability to share effects between different layers so you don’t have to do a lot of copy-and-paste operations between different layers when applying specific effects.”
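For context on the UDIM remark: UDIM splits a model's UVs across a grid of tiles, each tile getting its own texture file, which is how film assets carry many high-resolution textures on a single model. By long-standing convention, tiles are numbered 1001 + u + 10·v, which the small helper below computes (the function name is ours, not part of any Substance API):

```python
def udim_tile(u, v):
    """Return the UDIM tile number for integer tile coordinates (u, v).

    By convention, tiles are numbered 1001 + u + 10*v, with u in 0..9,
    so each row of ten tiles in U stacks upward in V.
    """
    if not (0 <= u <= 9) or v < 0:
        raise ValueError("u must be in 0..9 and v >= 0")
    return 1001 + u + 10 * v
```

So the first tile is 1001, the tile one row up and three columns over is `udim_tile(3, 1)` = 1014, and a single hero asset may span dozens of such tiles where a game asset typically uses one.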

The Substance Automation Toolkit is a decisive move in that direction. “It’s a way to automatically texture a humongous amount of assets almost automatically, driving Substance Designer and Substance Painter by code instead of manually, inside the Substance toolset,” Khouri explained. “Let’s say I have a database of materials, including woods, marbles and leathers. I also have a bunch of assets with materials that need to be applied, and I know exactly where. I can write a script that will go through Substance Designer and automatically apply this material to the correct area on a thousand different assets.”
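Khouri's database-driven scenario can be sketched in Python. The article does not document the toolkit's actual module or function names, so everything below is a hypothetical stand-in showing only the matching-and-dispatch logic a facility's script would wrap around real toolkit calls:

```python
# Sketch of scripted material assignment, per Khouri's example.
# NOTE: the real Substance Automation Toolkit API is not documented in the
# article; the print() at the bottom stands in for the toolkit call that
# would drive Substance Designer/Painter headlessly.

# A material database, keyed by material class.
MATERIAL_DB = {
    "wood": "oak_varnished.sbsar",
    "marble": "carrara_polished.sbsar",
    "leather": "leather_worn.sbsar",
}

# Each asset lists which material class goes on which named mesh region
# ("I know exactly where"). File names here are illustrative.
ASSETS = [
    {"file": "chair_001.fbx", "regions": {"seat": "leather", "frame": "wood"}},
    {"file": "statue_007.fbx", "regions": {"body": "marble"}},
]

def plan_assignments(assets, material_db):
    """Resolve every (asset, region) pair to a concrete .sbsar material."""
    jobs = []
    for asset in assets:
        for region, material_class in asset["regions"].items():
            jobs.append((asset["file"], region, material_db[material_class]))
    return jobs

if __name__ == "__main__":
    for asset_file, region, sbsar in plan_assignments(ASSETS, MATERIAL_DB):
        # A real pipeline script would invoke the toolkit here to apply
        # `sbsar` to `region` of `asset_file`, then move to the next job.
        print(f"{asset_file}: {region} <- {sbsar}")
```

Scaled from two assets to a thousand, the loop is unchanged; that is the appeal of driving the applications by code instead of by hand.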

[Embedded trailer: the Animus can be glimpsed at 0:40 and 1:34.]

As a practical example, Khouri cited Double Negative, where Lead TD Generalist Marc Austin used a “batch baking” methodology to apply Substances to the different materials of the mechanical Animus, a CG asset that was central to the story of Assassin’s Creed. “That’s old news, but we’re seeing more and more traction for this technique,” Khouri said. “We’ve been working with several companies, including Double Negative, to get feedback to help us create this package. When you’re talking about texturing and retexturing hundreds of thousands of assets automatically, these facilities are working under so many time constraints. It can take a few months to produce a certain number of shots. It’s extremely intense. If they can save even a few hours per day, this is a big deal for them.”
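The article doesn't detail Austin's batch-baking setup, but the general shape, queuing one bake command per asset and running them unattended, can be sketched as follows. The `bake-tool` command and its flags are hypothetical placeholders, not the actual commands used at Double Negative:

```python
import shlex

# Sketch of queuing one bake job per asset for unattended batch baking.
# NOTE: "bake-tool" and its flags are invented for illustration; the article
# does not document the real command lines in Double Negative's pipeline.

def bake_commands(meshes, output_dir="baked"):
    """Build one illustrative baking command line per input mesh."""
    cmds = []
    for mesh in meshes:
        stem = mesh.rsplit(".", 1)[0]
        cmd = "bake-tool --input {} --output-path {}/{}".format(
            shlex.quote(mesh), shlex.quote(output_dir), shlex.quote(stem)
        )
        cmds.append(cmd)
    return cmds
```

A pipeline script would hand this list to a render farm or run it via `subprocess`; the point is that every asset gets baked without an artist ever opening the application, which is where the hours-per-day savings Khouri describes come from.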

And Allegorithmic CEO Sébastien Deguy reiterated that the trend cuts both ways, with tools like Unreal Engine and Unity adding editing and color-management features. “If you have the visual quality you need [in a real-time game engine], there is no reason not to stay there rather than redoing the work and waiting for hours for it to render,” Deguy said. “There are things you cannot do with a real-time renderer, or not do easily. But in Rogue One, the robot K-2SO was rendered directly in Unreal Engine 4 and, it turns out, Neill Blomkamp is working with Unity on something that’s apparently very interesting. So I can understand why this is happening. The difference in rendering quality is small and getting smaller, and you don’t have to redo the work all over again.”

Allegorithmic: www.allegorithmic.com