Classic telly FX tech: How the Tardis flew before the CGI era
How the special effects boys made magic in the 1960s, 70s and 80s
Enter Computers, stage left
CSO had some inherent limitations too. Before the advent of motion-controlled camera rigs, it was impossible to move the cameras photographing the shot’s two components in perfect harmony. Nudge one and the background, say, would move while the foreground didn’t. Many a nicely keyed shot was spoiled by this kind of disjointed movement. It meant that CSO really couldn’t be used for anything other than static shots.
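The CSO principle itself is simple enough to sketch in a few lines: pixels close to the key colour are swapped for the background, everything else keeps the foreground. A toy illustration (modern, digital, and nothing like the BBC's analogue hardware, but the same idea):

```python
# Conceptual sketch of chroma keying, the principle behind CSO.
# Pixels near the key colour (here pure blue) are replaced by the
# background; everything else keeps the foreground. The key colour
# and tolerance are illustrative values, not the BBC's.

KEY = (0, 0, 255)  # the backdrop colour to be keyed out
TOLERANCE = 60     # how close to the key colour a pixel must be

def is_key(pixel):
    """True if a pixel is near enough to the key colour to be replaced."""
    return all(abs(c - k) <= TOLERANCE for c, k in zip(pixel, KEY))

def composite(foreground, background):
    """Hard-keyed composite: key-coloured pixels show the background."""
    return [bg if is_key(fg) else fg
            for fg, bg in zip(foreground, background)]

# A one-row 'image': an actor pixel, a blue backdrop pixel, a blue-ish one
fg = [(200, 150, 120), (0, 0, 255), (10, 5, 250)]
bg = [(30, 30, 30), (40, 40, 40), (50, 50, 50)]
print(composite(fg, bg))  # the two backdrop pixels come from bg
```

And because the key is computed per pixel with no knowledge of camera position, nothing in the process keeps two independently moving cameras aligned, which is exactly the limitation described above.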
Scene Sync, used only once in Doctor Who, motion-controlled CSO cameras
It didn’t stop directors from trying, of course. A common technique was to film stock footage of smoke or rain in order to superimpose it on a previously recorded tracking shot. But while the background shot reflected the camera’s pan, the smoke or rain shot didn’t, rather giving the game away. Of course, you could make exactly that kind of mistake with the laborious optical printing process used by the movie business at the time, and many films did, but it seemed worse somehow on the small screen.
Eventually, Evershed Power Optics engineer Reg King came up with Scene Sync, a motion control system which could co-ordinate two cameras’ movements to allow CSO background and foreground shots to move in harmony. A motion detection rig fitted to the first camera picked up pan and tilt movements and relayed them by cable to a slave unit which controlled the second camera. With careful calibration, the system could scale down movements to match the different scale of the subjects being shot. A camera recording a background model at 1:10 scale had to be panned a tenth as far as the master camera moved in order for the movements to match up.
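The scaling rule described above is straightforward arithmetic. A minimal sketch of the idea, with made-up numbers and an interface that is purely illustrative rather than anything resembling Evershed's hardware:

```python
# Sketch of the Scene Sync scaling idea: the slave camera's movement is
# the master's movement divided by the model's scale factor, so a 1:10
# background model needs a tenth of the master camera's movement.
# Units and values here are illustrative.

def slave_movement(master_pan, master_tilt, model_scale=10):
    """Scale master-camera pan/tilt down for a 1:model_scale model."""
    return master_pan / model_scale, master_tilt / model_scale

# The master camera pans 20 units and tilts 5; the slave shooting the
# 1:10 scale model moves a tenth as far to keep both shots in step.
print(slave_movement(20.0, 5.0))  # (2.0, 0.5)
```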
Scene Sync was first tried on Doctor Who during the recording of Season 18’s Meglos - which was also the only time it was used on the show. The production team allowed the story to be a guinea pig for the new technique, in exchange for which they got to use Scene Sync for free. While the Meglos experiment provided valuable data that would be used to refine the technique for other shows, including The Borgias and Jane at War, the Doctor Who production team found the process’s manual calibration too time-consuming to rely on.

This was never going to look right
In any case, just as computers were being used to control camera movements - a trick that goes back to the mid-1970s when it was pioneered on the likes of Star Wars - they were also moving into the video effects arena. As we’ve seen, Newbury, Berkshire-based Quantel launched Paintbox in 1981, and it soon became easier and cheaper to do picture composition and adjustment in the digital domain.
Paintbox comprised custom hardware that could grab two video fields, digitise them into a frame and store it. A keyboard, a tablet and a stylus were all a trained operator needed to combine video sequences, adjust the colours and paint in new ones - hence the name. In 1982, Quantel introduced Mirage, a box which allowed frames to be replicated, scaled, distorted and bounced around the screen. TV opening titles would never be the same again. As Paintbox quickly defined the look of early 1980s pop videos, so Mirage defined how a decade of television was presented.
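‘Grabbing two video fields and digitising them into a frame’ deserves a word of explanation: an interlaced TV picture arrives as two fields, one carrying the odd scan lines and one the even, which must be woven back together into a full frame. A toy sketch of the weave (Paintbox, of course, did this in custom hardware, not software):

```python
# Illustration of weaving two interlaced video fields into one frame.
# One field holds the even scan lines, the other the odd; interleaving
# them line by line reconstructs the full picture. Purely conceptual.

def weave(top_field, bottom_field):
    """Interleave two fields into a single frame, line by line."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

# A four-line frame split across two two-line fields
top = ["line0", "line2"]
bottom = ["line1", "line3"]
print(weave(top, bottom))  # ['line0', 'line1', 'line2', 'line3']
```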
Doctor Who benefited in particular from Paintbox, not merely because it soon proved a more flexible, more efficient alternative to CSO, but also because it allowed those archetypal quarries-as-alien-landscape shots to be made to look even more extraterrestrial by colouring the rocks blue, painting the sky purple or dropping shots of smoking volcanoes into the background. It also allowed director Lovett Bickford to pull Tom Baker to bits in The Leisure Hive.
Quantel’s Paintbox could easily be over-used...
Shots of the Cheetah People planet in 1989’s Survival, for instance, are way ahead of comparable ‘alien panorama’ effects shots from just three or four years previously. Survival was the last classic Doctor Who story. In the years that followed, Quantel kit was improved but eventually replaced by desktop computers running off-the-shelf photo editing software. Couple that with 3D modelling software and it became possible to assemble complex CGI shots - initially static ones, then panning shots and later, panning shots with small amounts of animated movement, smoke, birds and such - to convince viewers they really were seeing what they appeared to be seeing.
With the basic principle established, it was just a case of waiting for ever more powerful hardware to render all this imagery more quickly, to allow the pictures to be much more detailed, or both. Today, Doctor Who looks far more impressive than it ever has because of these resources, and better than it did eight years ago when it returned to TV screens.
Of course, time and the budgets that determine how many person-hours are available continue to limit what effects shots can be achieved, which is why even now Doctor Who rarely goes OTT beyond a couple of episodes a series. But even a small number of carefully crafted tweaks can go a long way toward selling a shot and convincing the viewer he or she is looking at Second World War London, subterranean Silurian cities or rotating black holes at the edge of the universe.
...but it could also be applied to great effect to add moons and active volcanoes
Though it’s also nice to know - particularly given Doctor Who’s long and much-mocked tradition of wobbly sets and iffy FX - that the production team’s reach still exceeds its grasp. Even today there are glitches, if you know where to look... ®