Classic telly FX tech: How the Tardis flew before the CGI era
How the special effects boys made magic in the 1960s, 70s and 80s
Doctor Who @ 50

These days it’s all done with computers, of course.
CGI – short for Computer-Generated Images, or Imagery – was a well-established visual effects technique long before Doctor Who was rebooted in 2005, so it was never in doubt that on-set mechanical effects would be duly combined with CGI visuals during post-production. Both the new series’ CGI and the picture compositing work were handled by The Mill until it closed its UK TV branch in April 2013.
A rare use of CGI in classic Who: Sylvester McCoy’s opening titles
TV production, composition and storage is now entirely digital, so computers are a necessary and inherent part of the production process. Not so in the 1960s and 70s, during the classic series’ lifetime. Back then the final product was analogue: two-inch Quad videotape masters made from edited videotaped studio footage and telecined 16mm film from location work and model shots.
By the end of Doctor Who’s initial run, computers were already being used in TV graphic design, model photography and video effects. Think, respectively, of Oliver Elmes’ title sequence for Sylvester McCoy’s Doctor, of the motion-control opening Time Lord space station model shot of the mammoth Trial of a Time Lord season, and of the various pink skies and blue rocks applied to extraterrestrial environments during the Colin Baker and McCoy eras.
Yet all these computer applications ultimately still resulted in analogue footage. A sequence shot on analogue videotape would be digitised, tweaked in a gadget like Quantel’s Paintbox rig, and then converted back into the analogue domain to be edited into the rest of the (also analogue) material. During the early 1980s (1983 in the case of Doctor Who) the BBC moved from two-inch Quad tapes to the more compact, more sophisticated one-inch C Format tape, but it was still analogue.
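That digitise-tweak-convert-back round trip can be sketched in a few lines. This is purely illustrative: the sample values, 8-bit depth and the brightening "tweak" are assumptions for the sketch, not Quantel's actual processing.

```python
# Sketch of the round trip: analogue level -> stored pixel -> digital tweak
# -> back to an analogue level. Bit depth and the tweak are assumptions.

def digitise(analogue_level, bits=8):
    """Quantise a 0.0-1.0 analogue level to an integer pixel value."""
    levels = (1 << bits) - 1
    return round(analogue_level * levels)

def to_analogue(pixel, bits=8):
    """Convert a stored pixel value back to a 0.0-1.0 analogue level."""
    levels = (1 << bits) - 1
    return pixel / levels

# Digitise a short scanline, brighten it digitally, return to analogue.
scanline = [0.10, 0.50, 0.92]
pixels = [digitise(v) for v in scanline]
tweaked = [min(255, p + 20) for p in pixels]  # the digital "tweak"
output = [round(to_analogue(p), 3) for p in tweaked]
print(output)
```

The point of the sketch is the lossy hop: whatever happened in the digital framestore, the result was converted back to an analogue signal before it ever met the edit suite.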
Want to pull the Doctor apart? You’ll be wanting Paintbox then
Paintbox was introduced in 1981 and didn’t achieve widespread use until the decade’s middle years. Then as now, doing effects digitally was relatively easy – it’s just about altering or combining pixel colour values. If one of Image A’s pixels in the framestore has a certain RGB value, write the output pixel value from stored Image B instead. Creating a good “green screen” shot is a little more complicated than that, of course, but that is essentially the algorithm for, say, superimposing a shot of the Doctor onto Raxacoricofallapatorius or one of RTD’s other worlds with outlandish names.
Creating the same shot entirely with analogue kit was something else, however. So how did Doctor Who’s special effects technicians make the Tardis and all those silver-sprayed washing up liquid bottle spacecraft seem to fly through the stars, and make the Doctor appear to be slugging it out with little green men on alien sands when yet another Buckinghamshire quarry shot simply would not do?
Back in the 1960s, even the basics of colour video signal manipulation were unavailable to the special effects boys. As tight as Doctor Who’s budget was, its designers and the model makers who turned blueprints into physical objects for photography were able to come up with incredibly detailed work.
Existing footage doesn’t do the Dalek city model work justice. Here it’s filmed alongside actors in a false-perspective shot
Looking back at early stories on DVD, most of which, though restored, still derive from low-quality film or video duplicates at many stages removed from the original footage, it can be hard to appreciate how good the original imagery was, though back then tellies used a mere 377 lines to display the picture.
The first Doctor Who stories, recorded in 1963, were transmitted using the System A format. Devised by EMI, System A streamed television pictures as sequential fields of alternating lines, two fields interlaced together forming a single frame. The moving picture was transmitted at a rate of 50 fields every second – so 25 frames a second – to harmonise it with the frequency of the mains electric current driving studio lighting. To have used a different frequency would have introduced strobing picture interference.
System A actually supported 405 scan lines in a frame, but only 377 were used for the picture, the rest being left blank to give slow display circuitry time to catch up with the incoming signal at the start of each new field. System A not only had 65 per cent of the vertical resolution of later standards, but it couldn’t do colour.
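The figures above can be checked with some back-of-the-envelope arithmetic, taking the later standard's 576 visible lines (of 625) as the comparison point:

```python
# Sanity-check the System A numbers: 377 active lines of 405 total,
# 50 fields per second interlaced into 25 frames, and the vertical
# resolution relative to the later 625-line standard's 576 visible lines.

SYSTEM_A_ACTIVE = 377          # picture lines actually displayed
LATER_STD_ACTIVE = 576         # visible lines in the 625-line standard
FIELDS_PER_SECOND = 50         # locked to the 50 Hz mains frequency
FIELDS_PER_FRAME = 2           # two interlaced fields make one frame

frames_per_second = FIELDS_PER_SECOND // FIELDS_PER_FRAME
resolution_ratio = SYSTEM_A_ACTIVE / LATER_STD_ACTIVE

print(frames_per_second)           # 25
print(f"{resolution_ratio:.0%}")   # 65%
```

Locking the field rate to the mains frequency is what avoided the strobing interference mentioned earlier: studio lights flickering at 50 Hz stayed in step with a 50-field-per-second camera.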
There’s some nice model work in The Space Pirates - but no stars
Video might be fine for capturing actors’ performances on brightly lit studio sets, but it wasn’t much good for model work. Videotape made models look even more like models than they already did; some of the model shots in Frontier in Space, Carnival of Monsters and, later, Full Circle demonstrate this flaw perfectly. Film, by contrast, allowed photographers to use more subtle lighting.
So effects shots were filmed, telecined to videotape and then edited into the master tape. Rockets, say, were mounted in front of a suitably starry backdrop, typically a back-lit black cloth with tiny circles cut where the photographer wanted the stars to appear. Sometimes the technicians didn’t even bother with stars. Quite a few of the later Patrick Troughton stories, most notably The Space Pirates, are full of ship models moving across the screen against a plain black background.