Electronic Cinema

Electronic cinema is something that - like a cure for cancer, or an off-switch for children - should have been invented a long time ago. After all, cinematography itself was invented back in 1889 by William Friese-Greene, and television thirty-odd years afterwards by John Logie Baird. Combining the two techniques, so that moving colour images could be acquired electronically and projected onto a cinema screen without any tedious in-between messing-about with celluloid, ought to have been a doddle. Yet most cinemas in the world still rely upon image generation technology that essentially hasn't changed for a century. So why is that, and what is the problem with the way things currently work?

The Need for Change

It costs a lot to distribute a 'print' to a cinema - over £1,500 - and considerably more to gear up for producing it in the first place. Distribution costs accounted for 36% of UK box office revenue in 2003 alone. What cinemagoers get to see tends to be governed by whoever has the financial clout to underwrite the production and distribution costs. And the costs don't stop there; after a while the print degrades, getting scratched and faded, and has to be thrown away. Moreover, at the sharp end of filmmaking, the processing costs of colour film can prove prohibitive to small independent filmmakers. The net result is that filmmaking and distribution remain the almost exclusive preserve of the big studios who can marshal the capital needed in this extremely risky business area.

Many people still prefer to see films at the cinema instead of renting DVDs because the cinema offers the experience of 'total immersion'. It is also more of a social occasion when shared with friends or even other cinemagoers. Unfortunately, for the very reasons outlined above, the choice on offer is limited by economics rather than by artistic merit.

Electronic cinema, by sharply reducing production and distribution costs, promises to provide an outlet for the creative talents of the smaller independent filmmaker1. Films could be copied to and shipped on digital cartridges, each of which costs about £100 to produce and ship out, then copied to the projector's hard disk drive. The cinema would not have to return the cartridge, unlike a film print. Films could even be downloaded from satellite links or the Internet, and shown within a matter of minutes. However, some considerable technical hurdles have had to be cleared for this vision to become reality.
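By way of a rough, back-of-the-envelope illustration (using the per-copy figures quoted above and a purely hypothetical release to 300 cinemas), a few lines of Python show how quickly the savings mount up:

    # Per-copy costs quoted above; the number of cinemas is a made-up example.
    PRINT_COST = 1500        # pounds per 35mm print
    CARTRIDGE_COST = 100     # pounds per digital cartridge

    def distribution_cost(cinemas, cost_per_copy):
        """Cost of supplying one copy of the film to each cinema."""
        return cinemas * cost_per_copy

    cinemas = 300            # hypothetical nationwide release
    print(f"35mm prints:        £{distribution_cost(cinemas, PRINT_COST):,}")        # £450,000
    print(f"Digital cartridges: £{distribution_cost(cinemas, CARTRIDGE_COST):,}")    # £30,000

And unlike prints, the cartridges can be returned and reused, so the real gap is wider still.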

The Challenges

Arguably, the lowest hurdle to clear is at the very beginning of a film production's life cycle. Television cameras have been around in one form or another for a long time now, and coupled with digital image recording, they make a compelling alternative to photographic film. It is very simple to shoot a feature film on high-definition video using an electronic camera - just ask George Lucas. Star Wars: Episode II - Attack of the Clones was shot directly onto digital video using a high-definition camera developed by Sony and Panavision. The production then shied at the last hurdle, displaying the electronic image on the big screen: it was transferred to conventional 35mm film before distribution in all save a few specially-equipped cinemas. The costs speak for themselves: $16,000 for 220 hours of digital tape, compared to $1.8 million for 220 hours of film! And digital tape can be reviewed as soon as it is recorded, whereas film has to be processed.
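Put another way (a quick sanity check on the figures quoted above, worked through in Python for convenience):

    # Acquisition costs quoted above for Attack of the Clones
    tape_total, film_total, hours = 16_000, 1_800_000, 220

    print(f"Tape: ${tape_total / hours:,.0f} per hour")    # about $73 per hour
    print(f"Film: ${film_total / hours:,.0f} per hour")    # about $8,182 per hour
    print(f"Film costs roughly {film_total / tape_total:.0f}x more to shoot on")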

So Far So Good, But...

It's the last hurdle that is the highest: until recently, displaying a feature film on a cinema screen electronically has been incredibly difficult. The conventional cinema projector, which generates a stream of moving images by interposing celluloid between the lens and a light source, is essentially a 'light valve': it modulates the light from the lamp, frame by frame. Originally, electronic projection techniques took another approach and combined the light source with the image generation. The first video projectors used separate red, blue and green cathode ray tubes (CRTs) and combined the images on the screen to give full colour rendition. CRTs have one very significant advantage over most other technologies: they can be run at virtually any resolution and hence have the potential to display very detailed images. Two other advantages are that CRTs are easy (and therefore cheap) to make, and that they have a long life before they need replacing.

This is where the advantages end, however. CRT projectors need frequent image realignment, their light output is dim and their colours tend to be washed out. Moreover, the colour spread from a single CRT leads to chromatic aberration, and hence to blurring of the final image. CRT projectors won't handle a screen bigger than 100 inches, which is fine only if you happen to be showing obscure art house films with a potential audience in single figures. The light-valve-based approach, on the other hand, is superficially far more attractive: it merely requires that the light source be good at producing white light and that the valve be an efficient modulator.

The Russians Get in on The Act

This didn't stop some people from trying to improve CRT technology. It was a Russian, Vladimir Zworykin, who invented electronic television in the first place, and his legacy lived on in St Petersburg. For a number of years, the Lebedev Institute had been working on a radical new version of the CRT: the quantoscope.

In a conventional CRT, the faceplate is coated with a phosphor powder that emits light when an electron beam strikes it. The light gets scattered and goes everywhere (which is why you can view a television from lots of angles). This is fine when the family is huddled around Doctor Who in the living room, but not so good when the intention is to get as much light as possible out of a lens and onto a cinema screen. Also, phosphors tend to emit a wide range of wavelengths centred on their main colour output. The result is that colours tend to be less intense than those obtained from film.

What the Lebedev scientists did was to grow single crystals of II-VI semiconductors2. They then cut and polished them into flat disks, deposited a metal mirror on either face and fitted them as faceplates to CRTs. When the electron beam struck the semiconductor crystal, the crystal emitted light in much the same way as a phosphor would, and the two mirrors trapped the light in such a way as to induce laser action in the crystal. Lasers emit very bright and very pure light, and the quantoscope can therefore generate very bright and very pure images - over 1,000 times as bright as those from a conventional CRT projector.

So why don't we see these projectors in use today? The reason is that the quantoscope is a temperamental beast that requires very careful handling. Firstly, it doesn't work until it has been cooled down to liquid nitrogen temperatures (77K). Secondly, the tube lifetime tends to be about 2,000 hours - after this, the electron beam destroys the metal mirror. Thirdly, the image tends to be of poor quality, because imperfections in the crystal lead to an uneven light output. Any one of these issues would be sufficient to preclude its use in commercial cinema; all three of them rule it out as a viable alternative to celluloid, for now.

The Big Breakthrough

Other technologists therefore turned to the alternative approach: creating an electronic light valve. Light valves can be either transmissive or reflective, and most of the early attempts used the former approach. One of these had a laser write an image onto a continuous loop of photochromic film: the film retained the image only for a short time, but long enough for light to be projected through it onto a screen. When liquid crystal displays became available, projectors were made which used these to generate the image by passing light through them. However, LCD projectors suffer from high cost, fragility, poor response times, low contrast ratios, unreliability and poor resolution, none of which endears them to even domestic viewers, let alone cinema owners and cinephiles.

The Hughes-JVC ILA projector, which evolved in the early 1980s from flight simulator displays, tried to address some of the issues with liquid crystal projectors that relied upon transmissive light valves. It was nevertheless an awkward chimera of CRT and liquid crystal display. The CRT wrote an image onto a silicon wafer photosensor, which then polarised a liquid crystal layer in much the same way that a photocopier drum transfers an image to paper. Light was then reflected from the liquid crystal to generate the image. Three of these light valves - red, green and blue - made up the projector. It provided extremely good image quality, having almost infinite resolution, but was very difficult to maintain from day to day, as it reportedly had a tendency to drift out of alignment. It was also cumbersome and very expensive: a cinematic 'white elephant'. Subsequent refinements addressed the liquid crystals with electrodes placed directly on the wafer, but these have since been superseded by the invention described below.

The Digital Micromirror Device

It wasn't until 1988 that the first major development in light valve technology occurred: Texas Instruments developed the digital micromirror device (DMD)3. This, like the Hughes-JVC projector, is a reflective light valve; light bounces off the DMD and is projected onto the cinema screen. As the light source can be as bright as one likes, and the device reflects virtually all of it, very large and bright images can be generated.

The DMD is a marvel of micromechanical engineering. Its surface is an aluminised silicon layer, etched into millions of tiny square mirrors - each less than one-fifth the width of a human hair - supported at two opposite corners by torsion hinges. Underneath this layer sits an array of electrodes controlled by a network of tiny transistors. Each mirror has an electrode situated under each of its free corners. Switching an electrode causes the mirror to tilt that corner towards it, so mirrors can be individually tilted so as to reflect light in one of two directions: through a lens, or away from it. Because each mirror can be switched on and off thousands of times a second, the proportion of each frame that it spends in the 'on' position sets the perceived brightness, allowing a simple on/off device to generate a full range of grey-scale light intensities (the sketch below illustrates the idea). DMDs are robust, reliable, and have high contrast ratios.
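For the curious, here is a minimal sketch (in Python, assuming an 8-bit grey scale and a 24 frames-per-second frame time purely for illustration) of how a mirror that can only be fully 'on' or fully 'off' nevertheless produces intermediate brightnesses. The real Texas Instruments drive scheme uses more elaborate bit-plane sequencing, but the principle is the same:

    FRAME_TIME_MS = 1000 / 24   # one frame at 24 frames per second, in milliseconds
    BITS = 8                    # assumed grey-scale depth, for illustration only

    def on_time_ms(grey_level):
        """Total time the mirror spends tilted towards the lens in one frame."""
        assert 0 <= grey_level < 2 ** BITS
        return FRAME_TIME_MS * grey_level / (2 ** BITS - 1)

    def bit_plane_schedule(grey_level):
        """Binary-weighted on/off schedule as (mirror state, duration in ms) pairs."""
        slot = FRAME_TIME_MS / (2 ** BITS - 1)
        return [((grey_level >> b) & 1, slot * 2 ** b) for b in range(BITS)]

    print(round(on_time_ms(128), 1))   # mid-grey: 'on' for about 20.9 of 41.7 ms
    print(bit_plane_schedule(128))     # the longest slice carries the most significant bit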

The first DMD projectors appeared in the mid-1990s and were low-resolution devices (800 x 600 pixels) designed mainly for corporate presentations. These used a single DMD chip combined with a colour wheel to generate full-colour images. Cinema-quality DMD projectors use three DMD chips, one for each primary colour, capable between them of generating up to 35 trillion colours; each chip typically carries about two million micromirrors.
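The 35-trillion figure is consistent with each chip delivering roughly 15 bits of grey scale - an assumption made here purely for illustration, not a published specification - and 'about two million micromirrors' matches, for example, a 1920 x 1080 mirror array:

    bits_per_channel = 15                       # assumed grey-scale depth per chip
    print(f"{(2 ** bits_per_channel) ** 3:,}")  # 35,184,372,088,832 - about 35 trillion colours

    print(f"{1920 * 1080:,} mirrors")           # 2,073,600 - 'about two million' per chip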

Originally, DMD projectors were very expensive; the first cinema projectors typically cost upwards of £200,000. However, as the technology has become more mainstream, prices have correspondingly fallen to the point where it is now possible for an independent cinema to 'go digital'. It's not just cinema owners and studios that are waking up to the potential of this technology. 209 cinemas across the UK were recently granted £11.7 million of National Lottery money - about £56,000 each - to install digital projectors, allowing participating cinemas to show more independent films, about two screenings per day on average. Moreover, in Ireland, all cinemas will have digital projectors installed over the coming year (from the date of writing). Instead of reels of celluloid being delivered, films will arrive on cheap hard disks, securely encrypted and decodable only by the cinema staff. Hollywood's stranglehold on our cinema viewing habits will weaken as more and more cinemas rediscover the freedom to show independent films at an affordable cost.

The Changing Landscape

The remaining challenges facing electronic cinema are image quality and cost. Digitally generated images, although vastly improved compared with those of the first CRT projectors, still can't achieve the ultra-high resolution of celluloid: about 2,000 lines of horizontal resolution are needed to surpass the quality of 35mm film. It is therefore highly unlikely that we'll see IMAX projectors, with an equivalent resolution of 30 million pixels, being replaced by DMD devices in the near future. Contrast ratios and colour intensities also still lag behind those of traditional film, and traditional film cameras and projectors remain a lot cheaper than their digital equivalents. Celluloid also has a characteristic image texture and subtlety that, in the hands of an experienced director, adds to rather than subtracts from the qualities of a film.
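To put those numbers side by side (taking a 2K, 2048 x 1080, DMD array as a stand-in for 'about 2,000 lines of horizontal resolution' - an illustrative assumption rather than any specific projector):

    dmd_pixels = 2048 * 1080
    imax_pixels = 30_000_000                                      # equivalent figure quoted above for IMAX
    print(f"{dmd_pixels:,} pixels")                               # about 2.2 million
    print(f"IMAX advantage: ~{imax_pixels / dmd_pixels:.0f}x")    # roughly 14 times as many pixels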

In fact, the only projector technology that looks capable of beating film purely on quality is that based upon the quantoscope, with its effectively unlimited resolution, colour saturation and contrast ratios - theoretically, anyway, as these devices have never been demonstrated outside the laboratory. However, some encouraging recent developments in this field, in which the lasing crystal has been replaced by a thin-film phosphor sandwiched between dielectric mirror stacks, allow the production of low-cost, high-intensity CRT tubes that have a long working life and do not require specialised cooling. This might not just allow digital technology to supplant film altogether; it could also make home cinema affordable to the point that nobody needs to go to the cinema any more. Whatever eventual impact this technology has upon the cinema, it's likely to be the biggest since 'talkies' were first shown - for better or for worse.

1 Anybody wanting to try their hand at digital filmmaking can now sign up for a course at Cinema City, Norwich. You get the chance to see your own digital short displayed on the big screen.
2 Semiconductors made from a divalent metal (such as zinc) and an element from Group VI of the Periodic Table. Typical II-VI semiconductors are CdS, ZnS, ZnSe, ZnO, etc.
3 This was originally developed as a light modulator for laser printers, and very nearly got canned due to budget cuts at TI.
