The terrifying tech behind this summer's zombie assault
And how 1GB should get you a poke in the eye from Winslet
Real life... only better
The infected scale the walls in World War Z Photo by MPC/Paramount Pictures © 2013 Paramount Pictures.
Rajat Roy, the global technical supervisor at Prime Focus World, explains further:
“When people get to the theatre they expect it to look like the world around us now…like real life. Whereas of course it’s not. It’s not supposed to be that, even. It can’t be. Because when you’re looking at a thing you’re automatically converging your focus on each thing that you’re looking at.
In the cinema you’re in a controlled fantasy environment where you’re actually looking at a physical screen that’s a certain distance from you. And whatever I’m tricking your brain to think you’re looking at by putting pictures on that screen creates a dichotomy between what is physically happening to you and what we’re showing you.
There are things that I can do to your brain that will hurt you, that are bad for you. If you can look around the image and see those things, they’re the things we’re trying to cut out.
Those artefacts are prevalent in stereo shooting, and they’re prevalent in stereo CG. That’s one of the things that I think is not well understood currently. 3D is not supposed to look like "real life"; it’s supposed to serve the purposes of a story. And where the 3D image is, where the focus is, is supposed to serve the story.”
The kit used to achieve these impressive tricks is comparatively unsophisticated: a mixture of Dell and HP boxes, either quad-core or dual quad-core, most with 24GB of RAM and an Nvidia Quadro 4000 graphics card. Fancy, but not exactly otherworldly.
Big data - literally
But the sheer quantity of data involved is dizzying. In the case of Prime Focus, there is an office in London and one in Mumbai, sharing data as required. Each digitised frame of film is over 12MB in size. Given that there are 24 such frames every second, you’re looking at an astronomical storage requirement: 288MB per second, 17.3GB per minute. Or 1.6TB for a 90-minute movie. And that’s just for the finished article.
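Those storage figures check out. A quick sketch of the arithmetic, using the 12MB-per-frame and 24fps figures quoted above (the article rounds the results slightly):

```python
# Back-of-the-envelope check of the storage figures quoted in the article:
# 12 MB per digitised frame, 24 frames per second, 90-minute feature.
FRAME_MB = 12
FPS = 24
RUNTIME_MIN = 90

mb_per_second = FRAME_MB * FPS                    # 288 MB every second
gb_per_minute = mb_per_second * 60 / 1000         # ~17.3 GB per minute
tb_per_film = gb_per_minute * RUNTIME_MIN / 1000  # ~1.6 TB for the finished film

print(f"{mb_per_second} MB/s, {gb_per_minute:.2f} GB/min, {tb_per_film:.2f} TB per film")
```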
Hollywood being Hollywood, studios insist on seeing all the options. The managing director of Prime Focus’s software-development subsidiary View-D™, Matthew Bristow, gives us a sense of how much data we’re talking about: "Every shot will go through multiple iterations. One shot could have between five and 25 versions."
Bristow says: "Given that there are on average about 2,000 shots for a film, we could be looking at up to 50,000 shots. What you don’t want to be doing is pumping around all the data for every single shot, so a copy of the plates will sit in every facility.
Instead, when we make a change, we don’t send through the entire shot, we send through a file that enables us to render the shot here – that minimises the amount of data traffic."
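The savings from that approach can be sketched with rough numbers. The 12MB frame size and 25-version ceiling come from the article; the frames-per-shot and the size of a change-description file are assumptions for illustration:

```python
# Why ship render instructions instead of plates? A rough comparison.
# FRAME_MB and VERSIONS are from the article; FRAMES_PER_SHOT and
# RECIPE_KB are illustrative assumptions, not Prime Focus figures.
FRAME_MB = 12
FRAMES_PER_SHOT = 120   # ~5 seconds at 24 fps (assumed)
VERSIONS = 25
RECIPE_KB = 50          # size of a change-description file (assumed)

# Sending full plates for every version of one shot:
full_transfer_gb = FRAME_MB * FRAMES_PER_SHOT * VERSIONS / 1000
# Sending only a small "recipe" file per version, rendering locally:
recipe_transfer_mb = RECIPE_KB * VERSIONS / 1000

print(f"Full plates: {full_transfer_gb:.1f} GB vs recipes: {recipe_transfer_mb:.2f} MB")
```

Even with generous assumptions, the recipe approach moves megabytes where shipping plates would move tens of gigabytes per shot.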
A still from World War Z: imagine stringing 2,000 of these shots together at 100MB a pop... This is a relatively tiny 566KB morsel. Photo by: Jaap Buitendijk. © 2013 Paramount Pictures
Prime Focus’s creative director, Richard Baker, adds:
"What we’ve also done, in the last year or so, is rather than preview every shot in DPX we will, up to a certain level, view changes as JPEGs… they’re like JPEG2000s which come in at about 4MB per frame, as opposed to a DPX that will be more like 12MB. The way we have things set up you should be able to open up a script in India or London and you should be able to see all the assets and render out a version."
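The bandwidth saving from previewing in JPEG2000 rather than DPX is easy to quantify. The per-frame sizes are Baker's figures; the shot length is an assumption for illustration:

```python
# Preview bandwidth: JPEG2000 (~4 MB/frame) vs full DPX (~12 MB/frame),
# per Richard Baker. The 120-frame shot length is an assumption.
DPX_MB = 12
JPEG2000_MB = 4
FRAMES_PER_SHOT = 120  # ~5 seconds at 24 fps (assumed)

dpx_gb = DPX_MB * FRAMES_PER_SHOT / 1000
jpeg_gb = JPEG2000_MB * FRAMES_PER_SHOT / 1000
saving = 1 - JPEG2000_MB / DPX_MB  # fraction of traffic avoided

print(f"DPX: {dpx_gb:.2f} GB, JPEG2000: {jpeg_gb:.2f} GB, saving {saving:.0%}")
```

Cutting each preview frame to a third of its DPX size drops review traffic by two-thirds before the recipe trick even comes into play.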
A simple frame without too many separate elements can be rendered in as little as a minute, but more complex scenes can take the software up to 25 minutes to shade and blend.
The team were reluctant to say exactly how long it might take to convert a whole movie - partly because it’s one of those jobs that’s never finished. As in movie post-production generally, tweaking and polishing keeps happening until the project’s due in cinemas.
However, Tony Bradley at Prime Focus told us:
"With all of the View-D render farm working, we use about 3.5 TeraFlops of calculation power for London; India is roughly the same, so all farms working is approx 10TFlops. Now this is a per-second figure, and over the course of a project we could use anywhere between 1,000 and 1,500 hours of this power to get a show out. This is the equivalent of a single-core machine running continuously for approx 1.8 million hours, or about 205 years."
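Bradley's figures hang together arithmetically. A quick check, taking the midpoint of his 1,000-1,500-hour range (the implied single-core speed at the end is our inference, not a quoted figure):

```python
# Sanity-checking Tony Bradley's render-farm figures: ~10 TFlops sustained
# over 1,000-1,500 hours, said to equal ~1.8 million single-core hours.
FARM_TFLOPS = 10
PROJECT_HOURS = 1250            # midpoint of the quoted 1,000-1,500 range
SINGLE_CORE_HOURS = 1.8e6       # Bradley's single-core equivalent

years = SINGLE_CORE_HOURS / (24 * 365)  # ~205 years, as quoted

# What single-core speed does the comparison imply? (Our inference.)
implied_core_gflops = FARM_TFLOPS * 1e12 * PROJECT_HOURS / SINGLE_CORE_HOURS / 1e9

print(f"{years:.0f} years; implied single core ~{implied_core_gflops:.1f} GFlop/s")
```

The numbers are self-consistent: 1.8 million hours is indeed about 205 years, and the comparison implies a single core doing roughly 7 GFlop/s, a plausible figure for the era's hardware.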