To paraphrase Ansel Adams: we, as Directors of Photography or Cinematographers, are 'techno-artists'. We must understand and embrace the technology so that we can use it wisely to create visual imagery, whether those images are created on 35mm film or on a digital chip. The artistry of lighting a movie has never been about the size or quantity of lighting instruments, but about the ways they are used in telling the story visually.
I recently produced and photographed a movie entitled 'The 7th Lie' in France with a professional digital camera. The story and the actors were excellent, and would have been considered excellent whether photographed on 35mm film or on a digital camera. For this movie I was looking forward to exploring the digital medium, using digital acquisition equipment instead of 35mm film. After testing several cameras, we settled on the Sony PAL DSR-500WS. This camera would give us the best quality digital image for the budget, which could then be 'up rezzed' to 35mm for theatrical distribution in France.
I must say, the digital images captured were excellent as digital images, but they are quite limited when compared to a 35mm film image. Film images have a depth that no other medium can match at this time. On 35mm film there is the ability to isolate images, when telling a story visually, using a variety of photographic methods. Isolating an image with a long telephoto lens is quite difficult, and relatively non-existent, when using a digital camera due to the physics of the current professional 2/3" digital chip vs. a 35mm film frame. 35mm film also has a wider latitude [the range from the lightest to the darkest areas of the frame] due to the film emulsion chemistry vs. the digital 2/3" chip. A digital image also does not contain as much detail as a 35mm image, because of what is technically termed 'compression'.
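The 'physics' mentioned here is essentially a depth-of-field effect: for the same field of view, a 2/3" chip uses a much shorter focal length than a 35mm frame, and so renders a much deeper field at the same f-stop, which makes it harder to isolate a subject against a soft background. The sketch below uses the standard thin-lens depth-of-field approximation; the sensor widths and shooting values are illustrative assumptions, not figures from the text.

```python
def depth_of_field(focal_mm, f_number, subject_mm, coc_mm):
    """Total depth of field (mm) via the standard thin-lens approximation."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = subject_mm * (hyperfocal - focal_mm) / (hyperfocal + subject_mm - 2 * focal_mm)
    far = (subject_mm * (hyperfocal - focal_mm) / (hyperfocal - subject_mm)
           if subject_mm < hyperfocal else float("inf"))
    return far - near

# For the same field of view, focal length scales with sensor width.
WIDTH_35MM = 24.9   # Super 35 frame width, mm (approx.)
WIDTH_23 = 9.6      # 2/3" chip width, mm (approx.)

focal_35 = 100.0                              # a long lens on 35mm
focal_23 = focal_35 * WIDTH_23 / WIDTH_35MM   # ~39mm gives the same view

# The acceptable circle of confusion scales with the frame as well.
dof_35 = depth_of_field(focal_35, 2.8, 3000, 0.025)
dof_23 = depth_of_field(focal_23, 2.8, 3000, 0.025 * WIDTH_23 / WIDTH_35MM)

print(f'35mm DoF: {dof_35:.0f} mm, 2/3" DoF: {dof_23:.0f} mm')
```

At the same subject distance and f-stop, the 2/3" chip yields roughly two to three times the depth of field of the 35mm frame, which is why backgrounds refuse to fall out of focus.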
Although the digital images are pleasant to look at when excellently lit, a pattern on a garment would not necessarily show all the discrete detail or subtle shifts in coloration; on a film frame, by contrast, due to its high resolution and the physical size of the frame, one would be able to see every subtle shift of design and color as the character moved through the scene. This difference becomes very important when, for example, an actor shows a subtle shift in facial expression which may not be captured on a digital medium due to shadow and facial detail.
[...] When digital scenes are lit beautifully for the 'digital medium' in a manner consistent with making any well photographed movie, it takes approximately a similar amount of time and a similar number of lighting units as when shooting 35mm film. When capturing images with a digital camera, however, individual light units must be more precisely controlled because of the way the digital video chip accepts light. By precisely controlling the scattering of light particles, the images created will be more dramatic in content and thus maintain a more 'filmic' quality, which is what most 'digital filmmakers' thirst for.
I have now returned to Hollywood where we are running tests at several facilities, including Eastman Kodak's Cinesite, for 'up rezzing' the digital images to 35mm for theatrical release in France. This 'up rez' is performed at a considerable cost. Instead of the money being applied to the production budget, it is now being applied to the post production budget. After post production and the completion of the movie, it might be wise to look at the true cost savings. As we near the end of post production and review the cost vs. savings, we currently find that the savings of shooting digitally rather than on film are minimal.
In this age of digital acquisition of information, as filmmakers, we have received this wonderful digital gift with delight. The digital camera is an excellent tool for acquiring complex imagery and will improve over time. Currently, images developed on 35mm provide the audience with a more luxurious palette of visual information than digital imagery. This will, no doubt, change in the future as digital cameras improve, but the images created digitally will still need to be lit with artistry to help stimulate and thrill the audience. [Cinematographer Michael A. Hofstein, 2003]
On Stage 22 at the CBS lot in Studio City, a TV film crew is spending a seemingly ordinary day shooting flashback and insert segments for the Fox sitcom 'Titus'. Those segments will be rolled in two days later in front of an audience, during a live shoot of the entire episode, titled 'The Last Noelle'.
Jack Kenny, one of the show's executive producers, doubling as director of this episode, is about to film a scene in which Titus' ex-girlfriend punches him in the face. 'A' camera operator John Dechene, however, asks Kenny to 'hold on', trots over to actress Danielle Weeks, and removes two tiny specks of lint from her black slacks. Shooting resumes.
'[Removing lint] is something I never would have thought about if we were shooting film,' Dechene explained later. 'But we're not shooting film - we're shooting [24p] HD. It was really just two tiny specks of dust, yet I saw them clearly, even though I'm looking at a flickering, black-and-white video image inside the viewing tube. The Sony [HDW-F900] camera picks up small contrasts, and I have to be aware of that fact when I'm shooting. In my entire career, I don't think I've ever noticed anything like that looking through ground glass [of a film camera lens]. But here, I noticed it on a tiny HD monitor, and if I could see it, then viewers would see it.' Such is life in the bold, new world of 24p, high-definition TV production.
'Titus' represents the best illustration to date of the highs and lows of 24p production. The show is believed to be the first multi-camera episodic sitcom to utilize Sony's 24p technology, in concert with first-generation Panavision Primo Digital 11:1 zoom lenses and Panavision's Ultraview, cine-style viewfinder. That viewfinder was designed by Panavision to mimic the ergonomics of a typical film camera, offering operators a higher viewing magnification than a film camera, while at the same time limiting them to eyeballing a black-and-white video image while composing shots.
What makes the 'Titus' transition so important is that producers are using essentially the same film crew that shot last season on 35mm. That's because producers agreed to produce this season's 24 episodes under the IATSE film contract, rather than under the Guild's tape agreement, thus assuring the show would continue to employ a larger film-style crew. By changing the jobs of some crewmembers, adding additional crewmembers, and attempting to replicate 'traditional' production methods while using 24p technology on a multi-camera show, 'Titus' is traveling an uncharted and sometimes controversial path. In the process, every member of the crew has become a test case for what happens when a film-trained professional transitions to HD.
Veteran film and television DP Bobby Byrne runs the show's camera department, relying for the first time in his career on HD technology. Byrne calls this season 'a learning curve,' in which he has received a 'strong education.'
Among Byrne's challenges: light issues, depth-of-field issues, and the need to evaluate camera shots in an entirely new way - using a 24-inch Sony BVM-D24E1WU HD monitor on-set, controlled by a hand-held electronic switcher that lets him check all four cameras on the same monitor. Last season, Byrne simply looked through a traditional film-camera viewing tube to evaluate color, light, and composition. But even though Panavision supplies a cine-style viewing tube with the Sony cameras, Byrne and his cameramen are limited to two viewing options: either a flickering, black-and-white image on the tiny monitor inside the camera, or an attached top-view color LCD monitor. While those monitors can show great detail to a trained eye - such as specks of lint - they aren't sharp enough to satisfy a veteran DP's needs. Thus, the on-set HD monitor has essentially become Byrne's 'viewfinder' as he sets about determining which shots are suitable.
'I always depended on my own eye to judge quality of light and other things,' says Byrne. 'I could use the quad-split [video assist] monitor to judge operating skill, but for color and light, I always used my eye, like most DPs. Here, I use a switcher to flip between each of the four cameras - we call them A, B, C, and X for Steadicam. That gives me a beautiful picture that replicates what I would see looking through a film camera. I've now gotten used to it, but it took a while. Shooting HD, the monitor has essentially become the best way to evaluate the quality of a shot. In that sense, it replaces both the traditional video assist combined with the viewfinder of a film camera - for the purposes of what a DP would normally use them for.'
That monitor, however, also happens to be Byrne's only line of defense against lens flares and other optical oddities that can pop up on a typical TV set. With his operators limited to using video monitors, they no longer can assist Byrne in detecting flares, as film cameramen often do with the naked eye.
'The [Panavision] lenses are very good, but they do accept flares a little more than our regular [film] zoom lenses,' says Byrne. 'We have to be very careful about back-lighting to prevent flares, but I'm the only one who can detect them on the HD monitor if they do appear. My operators can't see them on their small video monitors. Flares, therefore, have become totally my responsibility.'
Besides extra care with back-lighting, Byrne has also made other lighting changes on the 'Titus' set. 'We have less latitude with light than I would normally have with any of the Eastman Kodak film stocks I would ordinarily use for a sitcom,' he explains. 'In particular, we have to be careful in dealing with reds and whites, so we are a bit more muted with our lighting scheme. That's why this format would not be good for outdoor location shooting - it just blows out the whites. But on a controlled set, with four-cameras and a film crew, we can compensate for that. Just be careful on the white side, the hot side, and I think you will be OK, if you are shooting on a set.'
Byrne's other major challenge is keeping the HD cameras' extended depth of field from overwhelming foreground images. In that battle, he gets lots of help from the show's production design and set decoration departments, combined with enlarged camera aisles on the 'Titus' set and increased use of lens filters. 'These cameras can see everything, and it's hard to get soft backgrounds,' says Byrne. 'Everything is so sharp all the time. We have muted our backgrounds, using darker paints and lights. I also diffuse the lens more, using different filters, and usually shooting almost wide open. Widening the camera aisle to about 18 feet - it was about 12-14 feet last year - has also helped a lot, because that allows us to keep the cameras further away from the actors and make the foregrounds sharper, while still moving the cameras on dollies.'
Byrne adds that other HD issues make his job slightly more complicated this year. Among them: the need to work around dozens of cables that, last year, were not needed on the 'Titus' set, and the inability to casually roll out a film camera on non-filming days and use the viewfinder to plan shot composition. [Fragment from an article published on the 'Millimeter' website, February 1, 2001.]
Cinematographers have long debated the merits of film vs. digital, but today, as they evaluate options for upcoming features, they're just as likely to ponder all the bewildering flavors of digital that are available.
Sony, Panavision, RED and others have cameras vying for market share in this high-stakes game. ARRI recently threw its hat in the ring with the ALEXA, which has been used on enough features to be assessed by d.p.'s. In a quick poll, it got raves.
"The camera is awesome," said Byron Shah, who used ALEXA to shoot Disney's 'Prom'. Shah and helmer Joe Nussbaum tested it against several other digital cameras. "Unequivocally it had the best images," Shah said. However, ALEXA was new and Shah was leery of being a pioneer. "There was trepidation at the studio," he said, but Disney digital guru Leon Silverman persuaded the producers to take a chance. Shah shot 'Prom' in L.A. last summer.
The first ALEXA feature was shot in March 2010 in Berlin, where d.p. Anna Foerster used it on Roland Emmerich's period movie 'Anonymous'. Foerster took advantage of the camera's extreme low-light sensitivity in scenes illuminated only by candlelight. "You could clearly see the [flickering] of candles on the eyes, faces and costumes," she said.
Caleb Deschanel has used ALEXA on two features: Timur Bekmambetov's 'Abraham Lincoln: Vampire Hunter' and William Friedkin's 'Killer Joe'.
"It's the first digital camera I deemed good enough to use without feeling I'm giving up too much compared to shooting film," he said. "It's very close to film in latitude."
"ALEXA is the only digital camera where I felt I had the same latitude as film," echoed d.p. Bobby Bukowski, who finished shooting Oren Moverman's 'Rampart' on ALEXA in December.
So sensitive was the camera that the d.p. often found himself using practical lights as the sole source of illumination, shifting some of the lighting responsibility to production designer David Wasco. "[David] became as instrumental in designing the lighting as I did," Bukowski said.
ALEXA isn't cheap: $75,000 without lens. It's also heavy and subject to the usual cabling that Bukowski describes as "the bane of the digital world."
ARRI addressed some of the limitations at this week's NAB confab, unveiling a modular version that separates the camera's front from the recording function - allowing shooting with a lighter unit in tight spaces and with 3-D rigs, per ARRI's U.S. topper Glenn Kennel.
Other companies aren't exactly standing still. At NAB, Sony showed footage shot with its upcoming F65 to wide acclaim. And film - the standard by which the ALEXA users interviewed here judged the ARRI camera - is still around.
"I don't know the place for 35mm film right now," Bukowski said. "But I do know that well-lit digital can look too sharp and brittle." His solution: to couple digital cameras with the "old glass" of film lenses.
In the international movie industry, 'digital intermediate' has grown into a buzzword for a wide range of technical processes associated with bringing a movie into the theatre. 'Digital intermediate' encompasses the full range of technical processes in which a moving image from a digital source, direct or scanned from film, is manipulated, matched with the proper sound in digital format, and prepared for viewing.
Viewing may be:
> By feeding a data-source into a projector as used in a digital cinema;
> In a regular cinema after image and sound have been rerecorded onto 35mm film;
> From DVD or tape after image and sound have been converted to a video signal.
A 'digital intermediate' is a process by which sections of, or the entirety of a motion picture is digitized through the use of a 35mm film scanner, into digital image files, manipulated in some manner, typically with color grading and digital special effects, and displayed or projected, either in a digital form, also known as digital cinema [D-Cinema], or recorded to film, using a laser film recorder, for traditional film projection.
[A table here compared the steps of the Digital Intermediate Process with those of the Optical Film Process, e.g. digital grading vs. optical grading and timing.]
In the last year or two, the price of scanning film and recording it back to celluloid has come down so much that it is now economically feasible to bring an entire show into the digital domain, work with it there and record it out, ready for printing in the laboratory. The term 'digital intermediate' has come to be used for this process, and though it is still so new that in some cases it is being redefined with each show, the fundamental creative and technical advantages may soon make it a standard part of post-production for many features.
What does the 'digital intermediate' process offer? Most important, it provides unprecedented control over film color. Once your show is digitized, all the sophisticated color correction tools that are standard in video become available for film. Dark, medium and bright parts of an image can be timed separately, contrast can be adjusted, color can be changed gradually within a shot, 'power windows' can alter specified areas within the frame, and secondary color correction, where each color can be tweaked individually, is available. All of this happens in real-time, and with random access to the entire show.
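In software terms, timing the dark, medium and bright parts of an image separately corresponds to the familiar lift/gamma/gain controls. The toy sketch below illustrates that idea on a normalized grayscale ramp; it is not any particular grading system's implementation, and real systems operate on log-encoded film scans rather than a small array.

```python
import numpy as np

def three_way_correct(img, lift=0.0, gamma=1.0, gain=1.0):
    """Apply lift/gamma/gain to image values in [0, 1]."""
    out = img * (gain - lift) + lift               # lift raises blacks, gain scales whites
    out = np.clip(out, 0.0, 1.0) ** (1.0 / gamma)  # gamma bends the midtones
    return out

ramp = np.linspace(0.0, 1.0, 5)   # a 5-step grayscale ramp from black to white
graded = three_way_correct(ramp, lift=0.05, gamma=1.2, gain=0.95)
```

With these settings the blacks come up slightly, the whites come down slightly, and the midtones brighten - a mild 'squashed' look of the kind the article describes.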
This degree of control is available for entire features, allowing the creation of a look that would otherwise be impossible or require relatively unpredictable custom processing. One of the first films to use 'digital intermediate' for stylistic purposes was 'O Brother, Where Art Thou?' by the Coen brothers [ph: Roger Deakins].
Another major advantage 'digital intermediate' offers is that color correction is done only once, and the resulting file is then used to produce film and all video versions. Directors, cinematographers and editors can control and approve in one place the timing of all versions of their show and take care of other tasks, such as pan and scan and letterboxing, at the same time.
Not only can a show be color corrected in ways not previously possible, but the fact that the entire show will be manipulated digitally extends the creative palette and gives production the freedom to use more visual effects. As a result, it might allow some shows to shoot faster and, for better or worse, with less discipline. For example, a period piece could shoot with modern airplanes or TV aerials visible, knowing that they could easily be removed later. 'Digital intermediate' also provides an elegant way to combine footage shot in different formats, be it film or video.
The process isn't free - traditional lab work is still cheaper for film delivery alone, and it takes more time than traditional film timing. But 'digital intermediate' combines many processes and many budget items. The combined line items for film and video timing, titles and bread-and-butter opticals [fades and dissolves] can typically pay for a full digital finish. And prices will inevitably come down.
Scanning film takes time, and time is money. The result is that filmmakers and vendors must make choices about how much data is scanned from each frame. This number, the scan resolution, influences the economics of the entire process. Scans are measured in thousands of pixels of horizontal resolution. One 'K' means 1,024 pixels. A full-aperture '4K' scan has 4,096 pixels horizontally, and 3,112 pixels vertically. 4K is the current gold standard, and it's intended to faithfully record every single detail of the underlying film. 2K scans are less expensive and more common: 2,048 x 1,556 pixels to each frame. They yield files that are only a quarter the size of 4K scans: about 13 MB vs. 52 MB per frame.
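The per-frame sizes quoted above follow from the usual Cineon/DPX packing of film scans, in which three 10-bit RGB samples occupy one 32-bit word, i.e. 4 bytes per pixel. A quick sketch (ignoring the small file header, so the results are approximate) reproduces the article's figures:

```python
BYTES_PER_PIXEL = 4  # three 10-bit channels packed into one 32-bit word

def frame_megabytes(width, height):
    """Approximate uncompressed frame size in MB (decimal megabytes)."""
    return width * height * BYTES_PER_PIXEL / 1_000_000

mb_4k = frame_megabytes(4096, 3112)  # full-aperture 4K scan
mb_2k = frame_megabytes(2048, 1556)  # 2K scan: half the pixels each way

print(f"4K: ~{mb_4k:.0f} MB/frame, 2K: ~{mb_2k:.0f} MB/frame")
```

Since a 2K frame has half the pixels in each dimension, its file is exactly one quarter the size of the 4K file, which is the relationship the article cites.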
How do these resolutions compare to film? Theoretically, based on the grain structure of the emulsion, film could be pegged as high as 6K. Practically, however, this is only true for first-generation camera original, and only under ideal conditions. In practice, negative film is typically assumed to have a maximum resolution of about 4K. Release prints from an inter-negative, depending on whom you talk to, are said to have a resolution of well under 1,800 pixels across, and the projected image may actually be worse, because lamps can be misaligned and lenses can be dirty or out of focus.
What is the right scan size for 'digital intermediate'? Purists say that 4K is the only way to go. But many people say that 2K is more than good enough for theatrical distribution, since it offers as much or more resolution than the film prints we're seeing now.
The simplest 'digital intermediate' process entails scanning a cut negative. The resulting file can be color corrected, titles can be added, and some effects work can be done. This is the process that has been used most often so far. It's straightforward and relatively economical. But it's tantalizing to consider the possibility of scanning uncut negative. This would theoretically allow a show to be re-cut in the workstation, without the limits that negative splices now put on the release process - you'd build the show digitally and simply film it out. But the more you scan, the more money you spend and the more potential for confusion you introduce.
Once the show is scanned and visual effects are incorporated, color correction can be done with several systems. Digital timing offers capabilities that film professionals have only dreamed about in the past, allowing the look of a show to be refined in ways that were formerly impossible by any means. According to Bruce Everett, co-producer of HBO's miniseries 'Band of Brothers' [ph: Remi Adefarasin & Joel Ransom], the stylized colors of the show were created entirely in post. After desaturating, 'squashing' and tinting the image, secondary color correction was used to get skin tones, muzzle flashes, etc. back to a more natural look.
One critical issue for color correction is fidelity from the timing environment to the final film print. Monitoring can be done on specially calibrated high-definition video projectors or on CRTs. The goal is to pre-visualize the final film look as accurately as possible, and no system is perfect. It is this area of color fidelity that has seen some major hiccups in the past. Cinematographer Conrad W. Hall, who used the 'digital intermediate' process for 'Panic Room', says that the colors and brightness of the projected image changed during the six weeks that he spent in color correction. But he adds that this and other bugs will be worked out as more people use the process and the technology matures. In his view, the creative freedom afforded by 'digital intermediate' was more than worth the trouble and time involved.
Once a show is color-corrected, it must be recorded out to film. Outputs can be made to inter-positive, from which inter-negatives are made in a traditional lab process, or each printing negative can be output individually. That's more expensive, but it produces better quality - every print is generationally closer to the original negative.
In an environment where video acquisition is starting to make inroads in feature production, 'digital intermediate' offers a new lease on life for celluloid, giving filmmakers many of the creative tools that their TV counterparts have used for years. At the same time, the process takes us one step closer to a full digital workflow where all circled takes are scanned and a cut show is built entirely in the digital domain. Though we're not quite there yet, as prices come down, some type of 'digital intermediate' process may soon seem like a creative no-brainer for any show that will be released on film. [From an article by Rainer Standke in the Editors Guild Magazine, May-June 2002.]
After several years of expectation and anticipation, the D-Cinema [Digital Cinema] revolution is firmly underway, with thousands of screens now showing digital movies. The big screen revolution is happening, both in public cinemas and private homes, with dramatically improved image quality over the analogue technologies that have served the entertainment industry so well for so long.
D-Cinema refers to the use of digital technology to distribute and project motion pictures. The final movie can be distributed via hard drives, DVDs or satellite and projected using a digital projector instead of a conventional film projector. Digital projectors capable of 2K resolution began deploying in 2005, and since 2006, the pace has accelerated. HDTV and pre-recorded HD disks could put pressure on movie theaters to offer something to compete with the home HD experience.
Digital Cinema Initiatives [DCI], created in March 2002 and working in conjunction with members of the Society of Motion Picture and Television Engineers [SMPTE] standards committee, has published a system specification for digital cinema that was agreed upon by 7 major Hollywood studios: Disney, Fox, Paramount, Sony, Universal, Warner Bros. & Metro-Goldwyn-Mayer [which withdrew in 2005]. In 2008, the DCI published a new version of the Digital Cinema System Specification [more than 100 pages]:
A number of significant technology developments have occurred in the past few years that have enabled the digital playback and display of feature films at a level of quality commensurate with that of 35mm film release prints. These technology developments include the introduction of: high-resolution film scanners, digital image compression, high-speed data networking and storage, and advanced digital projection. The combination of these digital technologies has allowed many impressive demonstrations of what is now called Digital Cinema. These demonstrations, however, have not incorporated all of the components necessary for a broad-based commercially viable Digital Cinema system. These demonstrations have created a great deal of discussion and confusion around defining the quality levels, system specifications, and the engineering standards necessary for implementing a comprehensive Digital Cinema system.
The primary purpose of DCI is to establish uniform specifications for Digital Cinema. The DCI member companies believe that the introduction of Digital Cinema has the potential for providing real benefits to theater audiences, theater owners, filmmakers and distributors. DCI was created with recognition that these benefits could not be fully realized without industry-wide specifications. All parties involved in the practice of Digital Cinema must be confident that their products and services are interoperable and compatible with the products and services of all industry participants. The DCI member companies further believe that Digital Cinema exhibition will significantly improve the movie-going experience for the public.
Digital cinema conforming to the DCI Standard is referred to within the film industry as D-Cinema while all other forms of digital cinema are referred to as E-Cinema. E-Cinema may be anything, ranging from a DVD player connected to a consumer projector to something that approaches the quality of D-Cinema without conforming to some of the standards. Even D-Cinema itself has evolved over time before the DCI standards were framed. However, the current DCI standards were made with the intention of standing the test of time, much like 35mm film which has evolved but still retained compatibility over a substantial part of a century.
Barco DP100 - 2k [2048x1080] projector for screens up to 20m [66ft]
Christie CP2000 - 2k [2048x1080] projector for screens up to 25m [82ft]
There are currently two types of projectors for digital cinema. Early DLP [Digital Light Processing] projectors, used primarily in the USA, had a limited 1280x1024 resolution; they are still widely used for pre-show advertising, but not usually for feature presentations. The DCI specification for digital projectors calls for three levels of playback to be supported: 2K [2048x1080] at 24 frames per second, 4K [4096x2160] at 24 frames per second, and 2K at 48 frames per second.
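These three playback levels translate into substantial uncompressed data rates, which is why the DCI system pairs them with image compression. The sketch below assumes the DCI image format of three 12-bit components per pixel - a detail of the specification, not stated in the text above:

```python
BITS_PER_PIXEL = 3 * 12  # three 12-bit components per pixel (DCI X'Y'Z')

def gbit_per_s(width, height, fps):
    """Uncompressed video data rate in gigabits per second."""
    return width * height * BITS_PER_PIXEL * fps / 1e9

levels = {
    "2K/24": gbit_per_s(2048, 1080, 24),
    "4K/24": gbit_per_s(4096, 2160, 24),
    "2K/48": gbit_per_s(2048, 1080, 48),
}
for name, rate in levels.items():
    print(f"{name}: ~{rate:.2f} Gbit/s uncompressed")
```

4K at 24 fps carries four times the data of 2K at 24 fps, and 2K at 48 fps exactly twice - which makes clear why high-speed data networking and storage appear among the enabling technologies listed above.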
Three manufacturers have licensed the DLP technology [developed by Texas Instruments - at the heart of every DLP projection system is an optical semiconductor known as the DLP chip, invented by Dr. Larry Hornbeck in 1987; the first prototype projector was introduced in 1994]: Christie Digital Systems, Barco and NEC. Christie, maker of the CP2000 line of 2K DCI-compliant Digital Cinema projectors, is long established in traditional film projector technology throughout the USA and is the market leader in terms of units sold and deployed internationally. While NEC is a relative newcomer to Digital Cinema, Christie is the main player in the USA and Barco takes the lead in Europe and Asia.
The other, soon-to-be-deployed technology is from Sony and is labeled 'SXRD'. Their projector provides 4096x2160 resolution.
Digital cinemas can also deliver live broadcasts from performances or events. For example, there are regular live broadcasts to movie theaters worldwide of Metropolitan Opera performances. [Using quotes from Wikipedia, the free encyclopedia.]