The future of cinema and TV: It’s game over for the hi-res hype
All you've ever been told about moving pictures is wrong
Another normal feature of eye tracking is that if the object of interest is held still on the retina in real life, everything else gets smoothly blurred. But in artificial moving images this doesn’t happen. Instead everything but the tracked object seems to jump across the field of view at the frame rate. This is called ‘background strobing’, or just ‘strobing’.
Most of the grammar of cinematography is about avoiding strobing. One approach is to use long shutter times so that motion is smeared and the strobing is harder to see. Then you have to use tracking shots to keep your subject sharp so it is only the background that moves. Cinema also uses physically huge film frames or sensors, and huge lenses. By throwing the background out of focus the strobing is diminished. In cinema everything is controlled.
Panavision Genesis One digital camera system: note the big knob for focus pulling
In television, there is often little control and no budget for a focus-puller. That is why they have to use higher picture rates. That will probably remain true in the future. Rates will have to go up in both cinema and television, but the disparity will remain.
One of the reasons most home movies are rubbish is that the owner of the miniature consumer camcorder doesn’t want to carry a tripod that weighs more than his camera and seemingly prefers having the image going all over the screen. The best accessory you can buy for a consumer camcorder is a cement block that adds some mass to keep it stable.
Eye tracking causes interlace to fail in television. The two fields that make up a frame are captured and presented at different times, so to a tracking eye the odd and even lines never fit back together, except for marketing purposes. That HDTV programme you are watching with 1080 lines is interlaced: there are only 540 lines in each field, and that is the effective line count whenever there is any motion, yet oddly enough the larger number is always the one that gets publicised.
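The failure mode is easy to demonstrate. A minimal sketch (the function names are mine, not any broadcast standard's): an interlaced frame is sent as two half-height fields captured at different instants, and naively weaving them back together only works if nothing moved in between.

```python
# Illustrative sketch: a 1080-line interlaced frame travels as two
# 540-line fields captured at different instants. Weaving them back
# together recovers the full frame only when the scene is static;
# with motion, adjacent lines disagree (the "combing" artifact).

def split_fields(frame):
    """Split a frame (a list of scanlines) into odd and even fields."""
    return frame[0::2], frame[1::2]

def weave(top_field, bottom_field):
    """Re-interleave two fields into a single frame."""
    frame = []
    for top, bottom in zip(top_field, bottom_field):
        frame.extend([top, bottom])
    return frame

# Toy "frames": each scanline is just the horizontal offset of an edge.
static_frame = [0, 0, 0, 0, 0, 0]
top, bottom = split_fields(static_frame)
assert weave(top, bottom) == static_frame  # static scene reconstructs perfectly

# With motion, the second field is captured later and the edge has shifted:
top_t0 = [0, 0, 0]      # odd lines at time t
bottom_t1 = [5, 5, 5]   # even lines one field period later, edge moved 5 px
print(weave(top_t0, bottom_t1))  # [0, 5, 0, 5, 0, 5] - the comb pattern
```

To a tracking eye the result is exactly this comb: vertical detail collapses to the 540 lines of a single field the moment anything moves.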
Making pictures: the joys of interlaced scanning
A TV standard with only 720 lines, but where every line is present in every frame - so-called 'progressive scan' - can easily outperform a system that acts as if it has only 540 lines. When the US was choosing its HDTV standards this was well known, and the standards body was formally informed by the likes of MIT, the US military and the entire US computer industry that interlace was a dead duck. Such is the power of vested interests and steamrollering tactics that interlace was retained as a US standard for HDTV, even though it can't display high definition except when the picture is essentially static.
You can only pull a trick like that once, and the growing body of knowledge about moving imaging has ensured that interlace rightly won’t be considered in any future standards.
That body of knowledge came about in a variety of places. Manufacturers of TV sets were interested in getting the best possible pictures out of what came from the broadcasters by increasing the display frame rate at the set. For sports programmes, there was demand for good slow motion replays. Increasing international sales of television programmes required high-performance standards convertors to get from 50 to 60fps or vice versa.
Finally, there was work on compression standards so that digital moving pictures could be delivered with lower bit rates. These diverse requirements all discovered the same thing: dealing with eye tracking is the dominant factor. The solution is called ‘motion compensation’.
An example of a motion compensated process. This might be a slow motion process displaying video at quarter normal speed, or it might be an upmarket TV increasing the display rate to get rid of flicker. Note that the interpolated frames have to respect the motion otherwise the viewer sees judder and loss of resolution.
The principle of motion compensation is to identify the axis along which the tracking eye would watch a moving object. This is called the optic flow axis and it’s not parallel to the time axis. Suppose you want a slow motion replay at one quarter the original speed.
Between the pictures you have, it is necessary to interpolate, or calculate, three new ones. The key is that to get smooth motion, you can’t interpolate entire pictures; you have to interpolate the motion. As Figure 2 shows, every object that moves between the two input pictures has to be displayed with one quarter, one half and three quarters of the overall motion in the intermediate pictures.
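The arithmetic above can be sketched in a few lines. This is an illustration of the principle only, with names of my own choosing: each moving object is placed at a fraction of its total displacement along the optic flow axis in each interpolated frame, rather than being cross-faded in place.

```python
# Minimal sketch of motion-compensated interpolation for a quarter-speed
# replay: three new frames between each input pair, with every moving
# object displaced by 1/4, 1/2 and 3/4 of its total motion. Interpolating
# positions along the optic flow axis (not blending whole pictures) is
# what keeps the motion smooth.

def interpolated_positions(pos_a, pos_b, steps=3):
    """Positions of a moving object in the frames interpolated
    between input frame A (at pos_a) and input frame B (at pos_b)."""
    positions = []
    for i in range(1, steps + 1):
        t = i / (steps + 1)  # 1/4, 1/2, 3/4 for quarter speed
        x = pos_a[0] + t * (pos_b[0] - pos_a[0])
        y = pos_a[1] + t * (pos_b[1] - pos_a[1])
        positions.append((x, y))
    return positions

# An object moving from (0, 0) to (8, 4) between two input pictures:
print(interpolated_positions((0, 0), (8, 4)))
# [(2.0, 1.0), (4.0, 2.0), (6.0, 3.0)]
```

A real motion compensator must first estimate the displacement of every object (or block) between the input pictures; the interpolation step shown here is what it does with those vectors once it has them.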