Original URL: https://www.theregister.com/2011/11/27/3d_hardware_future/

Moviemakers on a quest for their real-time 3D Holy Grail

Windows or Linux to build a film frame by frame, server by server?

By Chris Bidmead

Posted 27th November 2011 11:00 GMT

The massive blockbuster Avatar reintroduced 3D to the 21st century.

The big difference from the previous 3D invasions was digital technology. Optically and physiologically the principle was the same: pairs of frames, representing a left eye view and a right eye view, are presented (near-) simultaneously to the viewer. The 3D picture then gets assembled by the human brain.

A shot from Avatar

Animation's come a long way from Toy Story to Avatar

But not, apparently, effortlessly. A study at California State University showed that watching a 3D movie tripled the risk of headaches and eyestrain among the 400 movie-goers taking part. The 3D cinema image is dark - light is lost through the 3D lens and through the polarised glasses. More fundamental is the "convergence/focus issue" discussed by Oscar-winning movie editor Walter Murch in Roger Ebert's Chicago Sun-Times column, uncompromisingly entitled "Why 3D doesn't work and never will. Case closed".

But Jim Mooney welcomes 3D if only because it doubles the number of frames his customers require him to render. He's the director of Render Nation, a managed video render farm in Liverpool's Innovation Park, founded in 2005 as one of the very few outfits at the time offering third-party rendering to the video industry.

There are several ways of arriving at the kind of 3D that doubles Mooney's business and that Murch so roundly condemns. Animated movies these days are typically created in a 3D world of textures, perspective and lighting, from which it's relatively simple to derive either a 2D or a 3D final print.

Real-world photography involving actors is a very different proposition. One way is to fake it. Shoot in conventional 2D, and then, as the movie industry saying goes, "fix it in post production". Dimensionalising is the industry's polite term for this. But doesn't it make for lousy 3D?

"Yes, dimensionalised movies have been slated," says Steve Prescott, group director of technology at Soho's post-production house, Framestore. "But it's usually because they've been rushed through post." Although some TVs purport to be able to dimensionalise conventional 2D on the fly, it's not something you fully entrust to software.

Prescott says doing it properly takes an outfit like Prime Focus, a huge post-production house with a 65,000-square-foot HQ in Mumbai, India. "They have thousands of people there, mapping out objects and projecting them onto models, converting 2D to 3D," he said.

Preferably, 3D is shot with two side-by-side cameras. "Using a stereo camera rig - an unwieldy thing that takes a lot of setting up," says Prescott. "Get the angle that the cameras are converging at wrong and the shoot can be unsalvageable. A bit of a shock for people who think everything can be fixed in post."
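The angles involved are tiny, which is why small rig errors are so hard to undo later. As a rough illustration - a simplified toe-in model, not Framestore's rig maths - the angle each camera turns inwards for a given interaxial spacing and convergence distance can be sketched like this:

    import math

    def toe_in_angle_deg(interaxial_m, convergence_dist_m):
        """Angle each camera turns inwards so the two optical axes
        cross at the chosen convergence distance (simple toe-in model)."""
        return math.degrees(math.atan((interaxial_m / 2) / convergence_dist_m))

    # Hypothetical numbers: 65mm interaxial (roughly eye spacing),
    # converging at 3m versus a mis-set 2.5m.
    print(toe_in_angle_deg(0.065, 3.0))   # ~0.62 degrees
    print(toe_in_angle_deg(0.065, 2.5))   # ~0.74 degrees

Fractions of a degree separate one depth setting from another, so a rig that has drifted on set leaves little room to recover in post.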

Andy Howard, Prescott's chief engineer at Framestore, confirms that 3D for theatrical presentation seems to be tailing off. The big switch has been about digital rather than about 3D.

"It's changing so fast," he says. "In February of this year, 70 per cent of the material delivered to us by clients for post production work was still being shot on 35mm film." By September the ratio had completely flipped: "Now 70 per cent is coming through from electronic cameras."

The workflow inside Framestore has always been digital, but now the need to run incoming 35mm film through a telecine device is fading. Another example of the receding tide of 3D leaving digital in its wake?

Maya have your attention, please: the tools of the trade

Howard points out that the term "3D" in production houses like Framestore most often refers to something rather different - the "3D in 2D" of animation movies, also used to insert 3D objects or characters into real-world 2D presentations. And that tide's by no means receding. Skills like these are very much in demand.

The majority of what Howard calls "traditional 3D work" at Framestore uses the software package Maya, with some XSI mixed in. Both owned by Autodesk, these suites of software tools are used for creating computer-generated characters and objects, often in conjunction with conventionally filmed 2D. Even the scenery. "Look at the Harry Potter movies," says Prescott. "Hogwarts only goes up to about 7 foot. All the environments above that are generated."

Tools like these do more than just help animators draw the characters and scenery. XSI, for example, also known as Softimage after the Avid subsidiary it was purchased from in 2008, has built-in "inverse kinematics". This technology solves complicated equations behind the scenes to calculate, for example, the precise changes of angle at the thigh, the knee and the ankle to walk a character in a life-like way from A to B.
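To give a flavour of what an inverse kinematics solver is doing under the hood, here is a minimal two-bone example in Python - an illustrative sketch with made-up bone lengths, not Softimage's actual solver:

    import math

    def two_bone_ik(tx, ty, l1=0.45, l2=0.45):
        """Analytic IK for a planar hip-knee-ankle chain rooted at the hip.

        Given an ankle target (tx, ty) in metres and bone lengths l1 (thigh)
        and l2 (shin), return the hip and knee joint angles in degrees.
        """
        d = min(math.hypot(tx, ty), l1 + l2 - 1e-9)    # clamp unreachable targets
        # Interior knee angle from the law of cosines
        knee = math.acos((l1 ** 2 + l2 ** 2 - d ** 2) / (2 * l1 * l2))
        # Hip angle: direction to the target plus the thigh's offset inside the triangle
        hip = math.atan2(ty, tx) + math.acos((l1 ** 2 + d ** 2 - l2 ** 2) / (2 * l1 * d))
        return math.degrees(hip), math.degrees(knee)

    # Move the ankle target a little each frame and the solver hands back
    # the thigh and knee angles for the new pose.
    print(two_bone_ik(0.3, -0.7))

A production tool solves chains like this for every joint in the skeleton, many times per frame, which is why the animator only has to drag the foot to where it should land.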

Towards the end of the last century, work like this needed specialist workstations. Today, says Howard, "we'll go out and buy an HP or a Dell workstation, a dual- or quad-core machine." The graphics card is important: in the last couple of years a lot of the computation has been pushed onto the GPU. Autodesk recommends Nvidia cards. "You can use a competitor's hardware," says Howard, "but Autodesk don't guarantee it."

HP ProLiant BL460

A meaty stack of HP ProLiant BL460 servers

Over in California, Ryan Granard, senior director of digital operations at DreamWorks Animation, tells a similar story on a larger scale. His outfit, best known for its Shrek franchise, has just delivered its latest feature-length animation movie, Puss in Boots, the product of a production pipeline stretching back more than five years. He's running an IT shop that's primarily HP, from workstations to servers, through the LAN and right out to the WAN. "Our artists - around 200 of them on Puss in Boots - are exclusively on HP Z800 workstations," he says.

"They're doing the same thing animators did 50 years ago, but on electronic drawing tablets - Cintiqs. They use electronic pens that work just as if they were drawing with ink, pen and pencil."

Once drawn, the images need to be animated, textured and lit, something the 8- or 12-core Z800s are powerful enough to do on a per-frame basis without external help. But the assembly into the final film, as released to distribution, needs to be "farmed out".

"When it comes to rendering," says Granard, "that goes to very, very large render farms made up of thousands and thousands of servers doing batch processing." Built around HP ProLiant BL460 blade technology, there are five of these farms, geographically dispersed across the United States and India.

Effectively it's a cloud operation. As a final movie like Puss in Boots amounts to around 120TB of data, that's a lot of bits to ship around. "We use HP networking," says Granard, "and it's very fast and very robust. We run at 10Gbit at the core and 1Gbit out to the desktop. We get out to our cloud services through 10Gbit pipes with lots of redundancy."
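Some rough arithmetic on Granard's own figures shows why those pipes matter: even flat out, a single 10Gbit link needs more than a day to move a finished feature.

    # Back-of-the-envelope: shipping 120TB over a 10Gbit/s link at full line rate.
    data_bits = 120e12 * 8          # 120 terabytes expressed in bits
    link_bps = 10e9                 # 10 Gbit/s
    hours = data_bits / link_bps / 3600
    print(f"about {hours:.0f} hours")   # roughly 27 hours, ignoring protocol overheads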

Windows or Linux: which works best for the studios?

During the film's creation the artists need to send off the sequences they're working on for rendering to check their results. It's typically an overnight process, like the old days of "daily rushes". The ideal would be instantaneous on-site rendering, but Granard explains: "We're generating so much visual complexity that we're still not at a point where we can get every single artist the sort of horsepower they need to do everything interactively."

The sequences are built up in layers: first outline sketches, then the creation in software of wireframe sculptures called "armatures". Once these armatures have defined the basic motions of the characters, their surfaces are textured. They're then used to create a rough layout, or "animatic", around which camera movement can be set up. Many layers of refinement follow: matching the action and lip movement to the prerecorded voice track, blocking out the way the character moves through the sequence, and adding the subtle touches that bring the character to life.
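Laid out as a simple ordered pipeline, the stages read roughly like this. The one-word names are illustrative shorthand, not DreamWorks' internal terminology.

    # The stages described above as a simple ordered pipeline.
    PIPELINE = [
        ("sketch",   "outline drawings of the sequence"),
        ("armature", "wireframe sculptures defining each character's basic motion"),
        ("texture",  "surfaces applied to the armatures"),
        ("animatic", "rough layout used to set up the camera moves"),
        ("refine",   "lip sync, blocking and the polish passes that bring a character to life"),
        ("render",   "farmed out overnight to check the results"),
    ]

    for step, (name, what) in enumerate(PIPELINE, 1):
        print(f"{step}. {name}: {what}")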

Jim Mooney admires the scale of bigger shops like Framestore and giants like DreamWorks Animation. His more modest Render Nation runs on 120 servers.

"The packages we support are the ones more appealing to freelancers rather than large studios," says Mooney. "Most of the incoming material we process here is done in 3DS Max."

It's another Autodesk product with a long history, having grown up through the Microsoft ecosystem from the early days of DOS. He adds: "All our machines are running the 64-bit versions of the various Windows operating systems."

Mooney says that for a shop his size, Windows is relatively cheap and easy to maintain. Framestore's chief engineer Andy Howard has gone a different route: "If we'd had to do all this on Windows, we couldn't afford the licences. We just couldn't make any money – we'd have had to shut down ages ago."

Licensing costs matter

Framestore runs almost exclusively on Linux, supplemented by a handful of Macs. Both operating systems are essentially Unix, which Howard finds helps a lot with maintenance. As Howard tells it: "Back in the late '90s the whole movie industry did the same calculations we did. If we're going to have to buy a new Windows licence for each workstation every year and a half, and we've got 500 operators - do the maths."
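Howard's "do the maths" is easy to sketch. The per-seat price below is a purely hypothetical placeholder - he doesn't quote one - but it shows how quickly a refresh cycle multiplies across 500 operators.

    # Howard's back-of-the-envelope. The seat price is a hypothetical
    # placeholder, not a figure from the article.
    seats = 500
    refresh_years = 1.5              # "a new Windows licence every year and a half"
    price_per_seat = 150.0           # hypothetical licence cost per workstation
    annual_cost = seats * price_per_seat / refresh_years
    print(f"roughly {annual_cost:,.0f} a year on OS licences alone")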

But it's not just about the licensing. "Linux is absolutely revolutionary," says Howard. "And the great advantage is that it's open. It allows you to script things, compile programs for it, which is where the individual facility companies get their edge. They write their own shaders, or fur programs or whatever, and compile and run them on Linux."
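As a toy illustration of the kind of code he means, here is a bare-bones Lambertian diffuse shade function in Python. Real facility shaders live inside the renderer and are written in C++ or a shading language, so treat this purely as a sketch of the idea.

    # Toy Lambertian (diffuse) shader - a sketch of the sort of thing studios
    # write and customise in-house, here in Python rather than the C++ or
    # shading-language code a production renderer would actually use.

    def normalise(v):
        length = sum(c * c for c in v) ** 0.5
        return tuple(c / length for c in v)

    def lambert(surface_colour, normal, light_dir, light_colour=(1.0, 1.0, 1.0)):
        n, l = normalise(normal), normalise(light_dir)
        intensity = max(0.0, sum(a * b for a, b in zip(n, l)))   # N.L, clamped at zero
        return tuple(s * lc * intensity for s, lc in zip(surface_colour, light_colour))

    print(lambert((0.8, 0.5, 0.2), normal=(0, 1, 0), light_dir=(1, 1, 0)))

The studio's edge lies in variations on this theme - custom fur, skin and lighting models - compiled against the renderer and run on the Linux farm.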

DreamWorks Animation took the same view when it was spun off from DreamWorks SKG in 2004. "We're primarily a Red Hat Linux shop," says Ryan Granard. "That was a decision our CTO Ed Leonard made years ago. As a high-performance computing shop we have a lot of fairly stringent requirements, and Linux has really held up extremely well for us."

Back in the mid-90s Microsoft had attempted to take over the 3D animation market with its purchase of the Canadian Softimage corporation, whose software had been used by Industrial Light & Magic to create the dinosaurs in the Oscar-winning Jurassic Park. But while Microsoft was porting the product to Windows NT, Disney released Toy Story, the first feature-length computer-animated film.

Buzz Lightyear

Buzz was rendered on Pixar's own software

Toy Story was something quite new to Hollywood, and the technology behind it was totally disruptive. The software tools used to string together its more than 100,000 frames had been developed in-house by Pixar - a once-failed video hardware vendor revived as a computer animation house under the leadership of Steve Jobs - and ran not on Windows but on a variant of Unix.

Microsoft acknowledged defeat and sold off Softimage in 1998. By the early 2000s, Pixar, like most other animation outfits, had converted to Linux. Although Pixar's home-grown software tools, packaged under the name RenderMan, were ported to Windows and Mac OS X, Linux had by this time become the standard animation platform. Long-term Windows developer Autodesk saw the writing on the wall, buying the cross-platform package Maya in 2005 and following up with XSI three years later.

So Linux rules, and scales up nicely as processor cores proliferate. But there are never enough of them. "In terms of processor power," says Granard, "one of the most expensive things we do is lighting the shot. You can imagine how frustrating that must be for artists, to create a sequence and then have to send it off overnight, and hope that it comes back looking the way you wanted it." Making intensive rendering like that near-instantaneous is what Granard calls "the Holy Grail". In a recent experiment he caught a glimpse of it.

"We pooled some of our HP workstations, and a few of the most powerful blades we have in our data centre, approximately 800 processor cores, and gave them to a single artist using one of our lighting tools. What would normally take overnight became pretty much an interactive session. I can't afford to give 800 processor cores to every artist. But we hit the biggest inflection point, which was proving we could do it."

Best of luck with that. Granard is up against "Blinn’s Law", first formulated by the NASA computer graphics pioneer Jim Blinn: "As computers get faster, the frames they're rendering become increasingly complex and nuanced. The average time to render an animation frame is a universal constant." ®