Film in the Context of Digital Media

Scott McQuire. Handbook of Film Studies. Editors: James Donald & Michael Renov. Sage Publications. 2008.

In 1997, as the extended celebrations of the first centenary of cinema were winding down, I undertook a research project examining the impact of digital technology on Australian film production (see McQuire, 1997). One of the most striking features of the several dozen interviews I conducted that year was that everyone I talked to—from leading directors, cinematographers, editors and those involved in various aspects of post-production, to producers and film educators—was unanimous that future cinema would no longer depend on film. While the time-frame for such a transition varied, no one doubted the trajectory towards cinema without celluloid. A decade on, much has changed, but film is still with us. Recalling this is not to suggest that the prognosis about filmless cinema was wrong, but to acknowledge that the transition to digital cinema has proved far more complex than was initially imagined.

Digitizing cinema has demanded the reinvention of basic aspects of film production, while simultaneously altering the experience of watching film, both in terms of what audiences see and hear, and how it might be delivered. It has also demanded a fundamental re-examination of the way the film industry functions as a business. The fact that the transition to digital cinema has occurred as part of a broader agenda of globalization manifested in accelerated transnational flows of cultural content and heightened conflict over intellectual property multiplies the dimensions at play. In the digital era of convergent media, film content is increasingly imbricated not only with television, video and DVD, but also with the Internet, video games and mobile devices. Add to this mix the way that online databases and new forms of storage and distribution are altering access to films as objects of both fan culture and scholarly analysis, and the full scope of current change becomes more apparent. Traditional Film Studies questions concerning authorship, narrative, style and subjectivity have increasingly been supplemented by analyses of the exigencies of transnational markets and global franchises, the logistics of new delivery systems and the politics of ‘piracy’, the emergence of new cultural forms and the promises of interactivity. Is it any wonder that a common lament within Film Studies over the last decade has been the ‘disappearance of the object’?

From its inception, the film industry has been a site of constant technological innovation. Nevertheless, looking beyond the undoubted hype around what Vincent Mosco (2004) aptly calls the digital sublime, there is something distinctive in the magnitude of change affecting the film industry in the present. Many innovations in film technology, such as changes in film stock, lighting or lenses, have been incremental. Others, such as the introduction of colour film or 16mm cameras, had a broader impact on film style and aesthetics. Prior to digitization, the most far-reaching technological transformation of the film industry was arguably the introduction of synchronized sound, which cut across various industry sectors including production and exhibition, producing a markedly different narrative style and audience experience, while short-circuiting the incipient internationalism of silent film in favour of the dominance of linguistically-based national cinemas. However, the impact of the digital threshold is demonstrably wider than all previous technological shifts. It not only affects all sectors of the industry simultaneously, from production through distribution to exhibition, but the broader context established by digital convergence means that the boundaries of film—its status as a distinctive cultural entity—are up for grabs in a new way.

In the following I want to examine the influences of digital technology on key changes occurring in the film industry. I begin with an overview of the way film production and post-production were digitized, and then discuss current moves in distribution and exhibition. I then focus on issues of security, intellectual property and piracy, before turning to debates over aesthetics and film scholarship in the digital era. Running through all these issues is the emergence of a new economic structure for the film industry.

Production

Prior to the 1980s, computers were largely ignored by the film industry. A few independent filmmakers such as Malcolm Le Grice (2001) conducted experiments in ‘computer film art’. Mainstream use was confined to the relatively few images that fitted narrative demands (the ‘Death Star’ simulation in Star Wars [George Lucas, US, 1977], the ship’s computer screen in Alien [Ridley Scott, UK, 1979]) and to title sequences (beginning with Superman [Richard Donner, UK, 1978]) constructed by what was then called computer graphics. Nor were the first experiments in integrating computer-generated images (CGI) with live action, such as TRON (Steven Lisberger, US/Taiwan, 1982) or The Last Starfighter (Nick Castle, US, 1984), particularly promising. As Brad Fisher put it, ‘The problem was that digital technology was both comparatively slow and prohibitively expensive. In fact, workstations capable of performing at film resolution were driven by Cray supercomputers’ (1993b: 52; see also 1993a).

However, even as the initial obituaries were being written, changes were underway which completely altered the situation. Exponential increases in computing speed coupled to decreases in the cost of processing not only launched the personal computer revolution of the mid-1980s, but put digital cinema on an entirely different footing. A second wave of CGI was signalled by Terminator 2: Judgment Day (James Cameron, France/US, 1991), which made morphing a household word. Two years later, the runaway box-office success of Jurassic Park (Steven Spielberg, US, 1993), followed by Pixar’s breakthrough digital animated feature Toy Story (John Lasseter, US, 1995), changed the question from whether computers could be effectively used in filmmaking to how soon this would occur. The subsequent decade, with its array of CGI-driven blockbusters led by the billion-dollar-plus worldwide theatrical gross of James Cameron’s Titanic (US, 1997), seems to have confirmed the shift.

Nevertheless, it would be a mistake to assume a linear trajectory, or, for that matter, a neat break from analogue to digital processes. While CGI gained the most attention from critics and audiences, the transition to digital technology happened rather earlier and more rapidly in areas such as sound production and picture editing. Digital recording and processing of sound took off in the early 1980s, building directly on the radical changes to film sound initiated by the introduction of Dolby Stereo in 1975. Ease of sonic manipulation in the digital domain facilitated the growing complexity of film soundtracks. Rick Altman noted the way that the ‘spatialization’ of discrete sound elements in the cinema auditorium altered audience experience:

Whereas Thirties film practice fostered unconscious visual and psychological spectator identification with characters who appear as a perfect amalgam of image and sound, the Eighties ushered in a new kind of visceral identification, dependent on the sound system’s overt ability, through bone-rattling bass and unexpected surround effects, to cause spectators to vibrate—quite literally—with the entire narrative space. It is thus no longer the eyes, the ears and the brain that alone initiate identification and maintain contact with a sonic source; instead, it is the whole body that establishes a relationship, marching to the beat of a different woofer. Where sound was once hidden behind the image in order to allow more complete identification with the image, now the sound source is flaunted, fostering a separate sonic identification contesting the limited rational draw of the image and its characters. (1995)

The use of sound to produce a more visceral experience was soon complemented by changes in the image. One aspect was the shift to digital picture editing which became widespread following the introduction of the Mac-based Avid Media Composer Series in 1988. Although it had been possible to finish films on video since the early 1970s, the advantage of video’s range of cheap, instantly viewable visual effects was purchased only at the cost of reducing editing to a linear tape-based process. But once digital non-linear systems had sufficient image quality to enable editors to obtain accurate lip-synch, digital editing blossomed. By 1994, the switchover for feature film production was near complete. In a paper presented at the 1994 SMPTE (Society of Motion Picture and Television Engineers) Conference, Dominic Case from leading film-services provider Atlab argued:

Non-linear editing has been adopted in the film and video industry faster than any comparable innovation, and I believe that it has had a greater effect on production and post-production methods even than the introduction of video. (1994: 50)

The reason for the rapid acceptance of the digital pathway in picture editing echoed the experience with sound: by reducing the material labour and the potential degradation of source material in the editing process, digital systems enabled significantly greater creative experimentation. As leading Australian editor Nicholas Beauman commented during his edit of Oscar and Lucinda (Gillian Armstrong, US/Australia/UK, 1997):

When you do a cut on film, it requires you to think about how you are actually going to construct a scene a lot more carefully than if you are cutting on a digital, non-linear system. I can throw something together on non-linear very quickly and then look at it and think ‘well, no, that’s not right’. I can just make a copy of that, or I can put that cut aside, do another one and another one. And I can show them all to somebody else. You can’t do that on film. Because of the time constraints, you have to think about it very carefully and say ‘OK, I am going to go down that road, I think this is the way to go with this scene’. If it doesn’t work, you have to peel all those splices apart again and start all over, and very few films can afford to do that. (1997)
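Beauman’s point about cheap, disposable versions follows directly from the underlying data structure: in a non-linear system a ‘cut’ is simply a list of references into untouched source media, so duplicating or varying a version costs almost nothing. A minimal sketch of the idea in Python (illustrative only; this is not Avid’s actual data model):

```python
# A 'cut' as a list of pointers into source media (which is never spliced).
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    reel: str      # source reel/clip identifier
    tc_in: int     # in-point, in frames
    tc_out: int    # out-point, in frames (exclusive)

# Version 1 of a scene: three shots drawn from two source reels.
cut_v1 = [Event("A001", 120, 168), Event("B004", 30, 75), Event("A001", 200, 260)]

# "I can just make a copy of that": an alternative version is a shallow copy
# plus a change -- on film, the same move means peeling apart real splices.
cut_v2 = list(cut_v1)
cut_v2[1] = Event("B004", 40, 60)   # trim the middle shot

def duration(cut):
    """Total running length of a cut, in frames."""
    return sum(e.tc_out - e.tc_in for e in cut)

print(duration(cut_v1), duration(cut_v2))   # compare versions side by side
```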

Digital workstations also enabled easier utilization of images from different sources. Directors such as Oliver Stone in Natural Born Killers (US, 1994) and Baz Luhrmann in Romeo + Juliet (US, 1996) soon incorporated a wide range of formats in feature film, including Hi-8 video and Super-8. While both Natural Born Killers and Romeo + Juliet were particularly complex edits, they belonged to a time when the pacing of films was accelerating. David Bordwell notes that the average shot-length in mainstream US films during the decades from 1930 to 1960 varied between eight and eleven seconds, while the average number of shots comprising a film over the same period was 300 to 700 (2006: 121-3). In contrast, films of the last decade regularly include 2,000-3,000 shots with average shot-lengths of two to four seconds, with the fastest paced films dipping below the two-second threshold. The experienced Luhrmann editor Jill Bilcock concurs with Bordwell that this shift is conditioned by digital editing (2006: 155):

Romeo + Juliet was such a big post-production job it wouldn’t have been able to be done in the time on film. You can quickly see those results on the non-linear system, where if you were doing it on film, you would probably have to put in a request for extra money to get inter-pos and create a film optical. You just wouldn’t get it done. It takes a lot more time to do … You wouldn’t cut like that on film. If I was cutting on film, I would be looking for two frames off the floor, I’d be scrounging around. It just wouldn’t have that creative freedom. So, creatively, it’s really good. (1997)
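Bordwell’s two sets of figures hang together through simple arithmetic: average shot length (ASL) is running time divided by shot count. For a hypothetical 100-minute (6,000-second) feature:

\[
\mathrm{ASL} = \frac{\text{running time}}{\text{number of shots}}, \qquad
\frac{6000\ \text{s}}{600\ \text{shots}} = 10\ \text{s} \ \text{(classical)}, \qquad
\frac{6000\ \text{s}}{2400\ \text{shots}} = 2.5\ \text{s} \ \text{(contemporary)}.
\]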

Post-Production

All these changes were in progress prior to the headlines made by Steven Spielberg’s ‘digital dinosaurs’ in 1993. Despite containing only about five to six minutes of CGI, Jurassic Park changed the industry’s thinking about the use of digital technology in film. The ensuing decade saw rapid experimentation with digital effects, as successive blockbusters undertook pioneering R&D in their attempts to show audiences things they had never seen before. By 1996 Cameron could claim, ‘anything is possible if you throw enough money at it, or enough time’ (quoted in Parisi, 1996a). Nevertheless, CGI still remains one technique among a range of others, including stunts, models, animatronics, and motion control. The computer has not replaced other pathways so much as redefined their use.

Arguably the biggest effect the flexibility and speed of digital technology has had on post-production is the marked move away from the traditional linear process of film production, comprising relatively distinct phases of pre-production, shooting and post-production. Increasingly, these phases overlap. Pre-visualization involves planning not only the logistics of set construction and camera placement, but also the relationship between live action cinematography and computer-generated effects, while sound and picture editing are often proceeding in tandem with principal photography. This shift to parallel processing represents less the disappearance of ‘post-production’ than its dispersal across all production phases. George Lucas highlighted this change during the making of Star Wars Episodes I-III (US, 1999; 2002; 2005):

I have been writing it [the next Star Wars episode] for two years, but I’ve also been shooting and editing, exploring different kinds of actors for different kinds of parts, and shooting and figuring it out. It’s not done sequentially at all. (quoted in Kelly and Parisi, 1997)

Pre-visualization has taken on an increasingly important role in the contemporary production process. While classical directors such as Alfred Hitchcock were fond of pre-planning scenes using model sets, digital technology has enabled far greater precision. Special effects supervisor Peter Doyle notes:

In the major films now no-one will even attempt to make a film without doing pre-visualisation. The complexity with which that happens depends on the budget, and the type of director. If you look at a James Cameron or John McTiernan, who did the Last Action Hero and all the Die Hard films, for the big action sequences they will actually make the entire scene in a computer with little stick men and CGI animation and full camera moves and the whole bit, and that is what’s signed off on. If that’s signed off, then the DOPs and the art directors and everyone else is then brought in, and they can then dial up what they need to know. So the camera men will know, ‘well this move is now on a 50 mm lens and I need to have lights here and I need to green screen here’. So that really changes the dynamics of the film. (1997)
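Doyle’s remark that departments can ‘dial up what they need to know’ amounts to treating the signed-off pre-visualization as a shared, queryable specification. A minimal sketch of what such a record might contain (the field names are invented for illustration, not drawn from any actual pre-viz package):

```python
# Illustrative only: a signed-off pre-viz shot as a specification that camera,
# lighting and art departments can query. Field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class PrevizShot:
    shot_id: str
    lens_mm: float                  # e.g. the '50 mm lens' Doyle mentions
    camera_move: str                # planned dolly/crane/handheld move
    green_screen: bool              # does the setup require a green screen?
    cg_elements: list = field(default_factory=list)

scene = [
    PrevizShot("045A", 50.0, "dolly in 2 m", True, ["CG debris", "stunt double"]),
    PrevizShot("045B", 27.0, "crane down to eye level", False, []),
]

# The cinematographer 'dials up' the lighting and rigging each shot implies.
for shot in scene:
    if shot.green_screen:
        print(f"{shot.shot_id}: {shot.lens_mm} mm lens -- rig green screen")
```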

Even though the majority of feature films are still shot on 35mm film, hybrid cameras provide a video split, enabling directors to instantly review shot coverage, as well as a data-stream that allows indicative sound and picture edits. Because of the increasing tendency for post-production to overlap with production, close communication between directors, producers and post-production houses assumes a premium. Different tasks need to be more tightly coordinated. Renowned sound mixer Roger Savage notes: ‘Now, there’s much more collaboration. There has to be … It’s all running in parallel. So you have to not only communicate, but you have to be technically compatible’ (1997).

As well as traditional horizontal editing, involving the sequential linking of images, films increasingly involve the ‘vertical editing’ of digital compositing, as individual picture elements are added or subtracted. Andrew Lesnie, who went on to win an Oscar in 2002 for his cinematography on The Lord of the Rings: The Fellowship of the Ring (Peter Jackson, New Zealand/US, 2001), comments:

I think it hasn’t changed the priority of my job which is to rationalise a project’s intent onto an image. So the technology is basically just another tool. You have a camera, lenses, filters, you’ve got the lab, all of which provides you with a certain scope, and you have digital technology which I am more than happy to use because it offers you all sorts of exciting possibilities. I think that probably the biggest thing to bear in mind is that you could get a little lazy about it by saying that it can fix things. (1997)
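Lesnie’s ‘just another tool’ can be made concrete. The ‘vertical editing’ of digital compositing described above is, at base, per-pixel arithmetic, and its standard building block is the ‘over’ operator, which layers a foreground element onto a background according to a matte. A minimal sketch with NumPy (production compositors add premultiplication conventions, colour management and much else):

```python
import numpy as np

def over(fg, fg_alpha, bg):
    """Layer a foreground element over a background (the 'over' operator).

    fg, bg   : float arrays of shape (H, W, 3), values in [0, 1]
    fg_alpha : float array of shape (H, W, 1), per-pixel coverage in [0, 1]
    """
    return fg * fg_alpha + bg * (1.0 - fg_alpha)

# Toy example: add a green element to the centre of a grey background plate.
h, w = 4, 4
bg = np.full((h, w, 3), 0.5)                         # background plate
fg = np.zeros((h, w, 3)); fg[..., 1] = 1.0           # pure green element
alpha = np.zeros((h, w, 1)); alpha[1:3, 1:3] = 1.0   # matte covers centre only

comp = over(fg, alpha, bg)
print(comp[1, 1], comp[0, 0])   # [0. 1. 0.] inside the matte; grey outside
```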

As Lesnie warns, the increased flexibility of the digital pathway does not always lead to efficient production. The contemporary tendency towards multi-camera coverage and the multiplication of ‘takes’ to ensure options for fine-tuning films during editing can lead critics conditioned by the standard Hollywood practice of producer control over the final cut, such as Bordwell (2006), to complain of sloppy shooting practices. But flexibility also offers new creative possibilities. It has often been noted that digital texts are radically ‘open’, insofar as digital tools lower the threshold for remixing and re-editing. The culture of the technological ‘mash-up’ favours the proliferation of director’s cuts on DVDs replete with out-takes and extra scenes, not to mention ‘updates’ of the sort that Lucas applied to the first three Star Wars films when he wanted to ‘fix’ their special effects for a re-released ‘digital edition’. As the existence of different versions of the same film becomes increasingly commonplace, film enters more fully into the remix culture characteristic of contemporary music or hybrid forms such as machinima. In the longer term, this is likely to further undermine the unifying fiction of the singular and bounded text, with its correlate of a lodestone of stable meaning. Even in a post-structural interpretative milieu, with its emphasis on polysemy and situated audiences rather than the sovereignty of authorial intention, an article of faith is that audiences are seeing the same text. Perhaps this assumption was always suspect: the contingency of dodgy projectors, different protocols of censorship and other forms of rough handling means that films—particularly ‘classic films’—have often been seen in quite different versions by different audiences. But such differences pale in comparison to Steven Soderbergh’s proposal for remixing current releases:

I’d like to do multiple versions of the same film. I often do very radical cuts of my own films just to experiment, shake things up and see if anything comes of it. I think it would be really interesting to have a movie out in release and then, just a few weeks later say, ‘Here’s version 2.0, recut, rescored’. The other version is still there—people can see either or both. (quoted in Jardin, 2005)

Distribution and Exhibition

Following the rapid transformation of film production in the 1990s, digital distribution and exhibition were the next steps in the seemingly irresistible march towards cinema without celluloid. In 1999 Lucas instigated the equipping of four US cinemas with digital projection systems for the premiere of Star Wars: Episode I—The Phantom Menace, and soon announced plans to shoot Episode II—Attack of the Clones on high-definition digital cameras and to screen it only in digital format. Ambitious initiatives to fund the roll-out of digital projection systems across the United States were announced at ShoWest 2001 by Technicolor Digital Cinema (formed by film services group Technicolor and digital communications company Qualcomm), while new players such as aerospace giant Boeing expressed interest in entering satellite-based distribution. Advocates listed numerous benefits of digitizing distribution and exhibition: Lucas extolled the improved image quality for audiences over the whole of a film’s run, while distributors focused on potentially substantial savings in distribution costs, and exhibitors were promised a range of rewards from increased programming flexibility to the generation of new revenue streams. However, in common with the experience of the production sector, the transition to digital distribution and exhibition has been significantly slower and proven far more complex than was first imagined. Although Episode II was captured on high-resolution digital cameras, the bulk of screens utilized film prints.

One of the key issues affecting the transition to digital cinema has been the question of common standards of equipment and software, from projectors and servers to encryption systems. Speaking in 2002, Julian Levin (Executive Vice-President, Digital Exhibition and Special Projects at Twentieth Century Fox) noted:

[S]uddenly in the US about a year ago, we had 50 or 60 systems, three different compression technologies required, at least two forced distribution channels through Boeing and Technicolor, having to provide your content in three or four different forms, and being forced to go through certain players at an expense that exceeds film to get the digital content to destination. (2002)

While proprietary systems appealed to those trying to gain a Microsoft-like grip on digital cinema, they made less sense to an industry accustomed to the universality of its product. Unlike television and video, 35mm film can be shot, processed and projected on standard equipment made by different manufacturers all around the world. In order to protect this existing interoperability, the seven major Hollywood studios formed the Digital Cinema Initiatives (DCI) consortium, chaired by Fox’s Levin, in March 2002. According to DCI CEO Chuck Goldwater:

Digital cinema was beginning to develop in different, not compatible directions, depending on who was the manufacturer, or the integrator of the system. Theatre owners had been through the digital stereo development issue, which resulted in three different and not compatible digital stereo systems and nobody wanted to see that happen again. This was an opportunity for the studios to bring focus to what they believe were objectives that everybody would find desirable and that is to ultimately develop a system with specifications and standards that were consistent and uniform in what they would produce. (2002)

While the process took significantly longer than expected, DCI finally released its specifications for system architecture in August 2005. However, having a common set of technical standards for manufacturers to adopt has not resolved all the issues. Alongside standards, the biggest question mark around digital exhibition has been its cost. High-definition digital projectors that comply with the DCI standard are significantly more expensive than 35mm projectors. While 2K-resolution digital projectors cost between $70,000 and $100,000, a new 35mm projector might cost $20,000, or less if second-hand. Film projectors that are well maintained can have a life span of decades, while digital projectors are likely to follow the pattern of computer equipment and need regular upgrades.

This question is complicated by the established industry structure. While the bulk of savings will accrue to distributors, who will no longer have to pay for the production and shipping of bulky film reels, the cost of re-equipping theatres would generally fall on exhibitors. The expectation, in the US at least, has been that the two parties would eventually arrive at a commercially negotiated cost-sharing arrangement. However, despite a rise in the number of digital screens in 2005, US exhibitors remain cautious. Michael Karagosian (2006), digital cinema consultant to the National Association of Theatre Owners in the US, points to continuing exhibitor concerns about cost, content security and the need for a certification programme to give exhibitors greater confidence in the technology. Karagosian argues that the low take-up of high-end digital systems (less than 1 per cent of world screens) reflects the ‘chasm’ currently preventing ‘early adopters’ from coalescing into an ‘early majority’.

The cost premium of the DCI-mandated standard does reflect the fact that it was designed to replicate or exceed 35mm projection, and thereby protect the theatrical experience from the emerging challenge of ‘home cinema’ technologies. However, the fact that the high-resolution systems mandated by DCI exceeded most of those already in operation around the world raises the real possibility that different regions may well adopt different technical standards. This is partly a question of fitting the technology to the context. In rural China, for instance, low-resolution digital projectors have been installed in settings which previously had no cinemas. Here the need to compete with the audio-visual quality of 35mm film does not arise. Similarly, many art house cinemas and non-traditional venues are utilizing low-resolution digital projectors to screen niche films in boutique spaces. The question of the cost of digital projection has to be balanced against potential savings and the creation of additional revenue streams, even in territories such as Australia, where mainstream screens are dominated by Hollywood products and where the quality of the theatrical experience is critical. Here considerable uncertainty still reigns.

Most observers agree that digital cinema will offer exhibitors advantages in terms of more flexible programming options. Exhibitors will be able to expand the number of screens devoted to a popular film without having to wait for extra prints to be shipped, and they will also be able to improve the quality and versatility of pre-feature advertising, as the practice of manually assembling platters is replaced by a computer-controlled playlist. However, this flexibility comes at a price: theatres will need to employ IT specialists to service the new networks and equipment.

Another major drawcard of digital cinema is that it should enable exhibitors to broaden their operations from exclusive reliance on feature films towards a more varied menu of attractions. Networked theatres able to display digital content delivered by cable or satellite can also use their screens for broadcasting live events. However, while isolated experiments with premium sporting events have been successful, there are significant hurdles to live content becoming a regular revenue stream. As Karagosian has pointed out:

Sports would be really neat, except that sports has much better return for the buck on a satellite TV network. It’s copyrighted and it’s time valued … Sports seems so interesting, but it’s probably not practical, except for very special occasions. (2002)

Moreover, increasingly those ‘special occasions’, such as World Cup football, are available on large screens which are located not in cinemas but in public spaces. Other mooted alternative uses of digital cinemas are multiplayer interactive gaming and 3D projection. The advantage of gaming is that it could exploit the high-quality image and sound facilities of theatres, while potentially attracting audiences outside of peak times. Digital technology has also given 3D cinema—the perennial ‘next-big-thing’—another lease of life. Following the strong support given to 3D films by leading directors such as Cameron and Lucas at trade convention ShoWest in 2005, 3D-equipped cinemas experienced rapid growth in 2006.

However, alternative uses of theatres such as gaming and 3D remain in their infancy. Given the significant cost premium on high-resolution digital projection systems compared to 35mm projectors, the limited take-up in most territories is unsurprising. In mid-2006, Australian exhibitors were still expressing a high level of scepticism as to whether a roll-out of DCI-mandated facilities made economic sense, particularly given that 1.3K resolution projectors were now available for around $11,000. In fact, low-end digital projection systems used for pre-show advertising have already made significant inroads into cinemas. The business case for low-end digital projectors has been far more compelling to exhibitors, with smaller upfront costs directly offset by increased revenues flowing from flexible content and more precise targeting of specific audiences. Karagosian argues that cinema becomes proportionally more attractive to advertisers as viewers continue to drift away from network television:

Electronics offers this whole level of interesting management of who you can target with your advertising. What that means to the exhibitor is more bucks for that ad, because of the higher quality of those eyeballs he can sell. (2002)

One of the long-held dreams around digital distribution and exhibition is that it will enable a democratization of cinema as thresholds to market entry diminish. There are a number of different dimensions to the issue. Web-based distribution has clearly become an increasing presence for both legitimate services (Atom Films, Movielink, CinemaNow, iTunes and so on) and ‘pirated’ exchange using peer-to-peer software. However, access to theatrical screens is a different issue. Film distribution remains a highly concentrated industry, with the seven major Hollywood studios dominating global revenues. Karagosian argues:

Let’s say you’re a filmmaker and you have a great piece of art that you’re proud of and you’re not sure whether anybody else is. But you’re proud of it and you want to distribute it. So let’s say there’s a network to 10,000 screens in the US, what’s compelling to them? Why do they want to look at your movie? Why do they want to put it out there? What you will need is an enabler, a distributor of some kind who you cut your deal with and who is able to take your movie and put it out. I think it’s going to be a well-controlled channel. It’s not going to be open, not because the electronics won’t allow it to be open, but because there isn’t a compelling reason for people to just be able to play any kind of content. (2002)

Karagosian’s analysis is confirmed by economists such as Abraham Ravid who argues that the major studios are currently undergoing an extensive functional shift as they morph into entities more akin to book publishers, less concerned with production than with marketing and distribution (2005: 54). Nevertheless, the rapid growth in networks of low-end digital projectors used to screen pre-show advertising in cinemas raises the possibility that, even in the absence of a broad shift to high-definition digital exhibition for mainstream feature films, there is new scope for screening alternative content such as films originated on DV.

This possibility is facilitated by another aspect of digitization: databases. Karagosian highlights the role of companies such as Hollywood Software, which pioneered distribution software for breakout films like the Hi-8-shot The Blair Witch Project (Daniel Myrick/Eduardo Sanchez, US, 1999) and My Big Fat Greek Wedding (Joel Zwick, US/Canada, 2002):

[Y]ou can go in with your movie and say it’s a drama, it’s this long, geared for this age group, what screens would this best show on in the US? This database can come back and say it’s going to show best on these screens. I want to show it for about three months, what time base can I get? And it goes on down the line. So these tools help you optimise where you’re going to get your biggest bang for your buck historically from your class of movie in the theatre. Previously this task would’ve taken a large team of people to get on the phone and find out the availability of every cinema chain around. Suddenly you’re learning all this information from databases and very smart software. (2002)
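What Karagosian describes is essentially a filtered query over historical screen-performance data. A minimal sketch of the idea (the schema and figures are invented for illustration; this is not Hollywood Software’s actual product):

```python
# Illustrative only: match a film's profile against historical screen data.
screens = [
    {"id": "NY-014", "strong_genres": {"drama"}, "core_audience": "25-40", "weeks_free": 14},
    {"id": "LA-201", "strong_genres": {"action", "drama"}, "core_audience": "18-25", "weeks_free": 8},
    {"id": "CH-007", "strong_genres": {"comedy"}, "core_audience": "25-40", "weeks_free": 20},
]

def best_screens(genre, audience, weeks_wanted):
    """Screens whose history matches the film's genre, audience and run length."""
    return [s["id"] for s in screens
            if genre in s["strong_genres"]
            and s["core_audience"] == audience
            and s["weeks_free"] >= weeks_wanted]

# 'It's a drama, geared for this age group, I want about three months...'
print(best_screens("drama", "25-40", 12))   # -> ['NY-014']
```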

Security, ‘Piracy’ and IP

Digital distribution of films immediately raises the contentious issue of content security. Because of the extremely high ratio between film development costs and digital reproduction costs, as well as the film industry’s reliance on sequenced release windows cascading through different territories and across different platforms, film is peculiarly susceptible to ‘piracy’. For this reason, peak industry bodies such as the Motion Picture Association of America (MPAA) have taken a leading role in public debates and political lobbying around copyright protection. From former MPAA Chairman Jack Valenti’s mantra, ‘If you can’t protect what you own, you don’t own anything’, to his successor Dan Glickman’s assertion that ‘Protecting intellectual property will become a resounding theme for our economy in the decades to come’ (MPAA, 2005a), the MPAA has been active in promoting legislative, technological and behavioural change. Heightened concern with copyright protection and intellectual property (IP) reflects a broader economic shift away from trade in goods towards trade in services, knowledge and information. The emergence of new regulatory regimes since the 1980s also reflects a transition from what John Frow (2000) calls a ‘development’ framework for regulating transnational knowledge exchange to a ‘trade’-based framework.

Copyright infringement has been a concern for the major studios at least since the arrival of the VCR in the 1980s. However, digital technology has vastly increased the scale of copying. The MPAA put the direct worldwide cost of piracy to film producers at US$6.1b in 2005, and the worldwide loss to the motion-picture sector (producers, distributors, theatres, video stores, pay-per-view operators) at a staggering US$18.1b (see MPAA, 2005b). Digital technology has also increased the complexity of the issues involved. The standard means of controlling the ability of users to modify traditional consumer goods has been to restrict physical access to them. Once someone has purchased (for example) a car, it is nearly impossible to prevent the user from ‘tinkering’ with it. In the analogue past, when one purchased a book or musical recording, there were few restrictions on what one did with the physical object. Unauthorized republication of content could be dealt with through copyright law, and private copying was generally so laborious, and the results of such dubious quality, that there was little effort to police it except in exceptional cases.

However, the ‘perfect’ nature of today’s digital copies, coupled with the increasing speeds of networks enabling widespread peer-to-peer distribution, has altered this situation radically and irrevocably. Fear about the ‘Napsterization’ of film content has meant that the first line of defence against copyright infringement has moved from legal prohibitions to technical prophylactics. Since unrestricted access to digitally-stored information inevitably enables high-quality copying, embedded technical protection measures such as Digital Rights Management (DRM) schemes become ‘logical’ responses.

At a technical level, films released on DVD generally have encrypted content that requires a ‘key’ for access. In order to get the key from the content owner, DVD-player manufacturers are required to sign a license mandating design standards which specify limits to user interaction with the copyrighted work. In addition, manufacturers must also comply with ‘robustness’ requirements which make their technical mechanisms harder to circumvent. Technical fixes have been supported by legislative changes such as the Digital Millennium Copyright Act (enacted in the US in 1998), which made it illegal to break encryptions such as the Content Scramble System (CSS), and also outlawed devices designed to circumvent technical protection.
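The structure of this arrangement is easier to see in miniature. CSS itself is a proprietary cipher, so the toy model below uses a deliberately weak XOR scheme invented purely for illustration: the content is locked with a title key, the title key is locked to each licensed player key, and a player without a licensed key never recovers usable content.

```python
# Toy model of key-based content protection. Structure only: the real CSS
# cipher and key hierarchy are proprietary, and XOR here is deliberately weak.
def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

title_key = b"\x13\x37\xbe\xef"
disc = xor_bytes(b"feature film frames ...", title_key)    # 'scrambled' disc

# The disc also carries the title key locked under each licensed player key.
licensed_player_keys = [b"\xaa\x01", b"\xbb\x02"]
locked_title_keys = [xor_bytes(title_key, k) for k in licensed_player_keys]

def play(disc, locked_keys, player_key):
    """A compliant player unlocks the title key, then descrambles the disc."""
    for locked in locked_keys:
        candidate = xor_bytes(locked, player_key)
        if xor_bytes(disc, candidate).startswith(b"feature"):  # crude sanity check
            return xor_bytes(disc, candidate)
    raise PermissionError("player not licensed for this disc")

print(play(disc, locked_title_keys, b"\xaa\x01"))    # licensed player: plays
# play(disc, locked_title_keys, b"\xcc\x03")         # unlicensed: raises
```

The ‘robustness’ requirements mentioned above amount to keeping the equivalent of `player_key` beyond the user’s reach.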

Of course, effective operation of DRM schemes requires both technical and legislative implementation. Legislation remains essential to content owners since, despite their market power, it is difficult to get every manufacturer to agree to technical standards that restrict consumer rights. As digital television became an increasing reality, the same coalition of film studios and technology manufacturers which developed CSS proposed the ‘broadcast flag’ as a means of limiting the copying of broadcast material. As Tarleton Gillespie notes, such mechanisms significantly limit user agency in relation to computer technology by excluding the operation of the technology from the users’ scrutiny. They also have major potential ramifications for the future of software design. In their response to the Federal Communications Commission concerning the proposed rule, the Electronic Frontier Foundation (EFF) argued that ‘robustness’ requirements would effectively lock Open Source designers out of the emerging market for digital television software:

To the extent [that] the open source development model embraces the freedom to modify, however, this necessarily means that open source software cannot be made ‘tamper-resistant’ in the fashion contemplated by the broadcast flag mandate. (EFF, 2004)

As well as technical and legal measures, content owners have conducted sustained promotional campaigns to alter social behaviour. The legal cases conducted against key peer-to-peer distribution sites and high-profile users have been supplemented by the widespread use of the rhetoric of ‘piracy’ to change perceptions about file sharing, and to legitimate the shift to a more common use of criminal sanctions. The MPAA website now carries a wide range of material directed at parents, teachers and students, arguing the case against ‘piracy’, as well as promoting ‘legal’ online distribution sites.

Theorizing Digital Aesthetics

Alongside the transformation of production, distribution and exhibition, digital cinema represents a fundamental change in the nature of the film image. Trinh T. Minh-Ha argues:

What is at stake is the difference that begins at the core, with the formation of the image itself. The film image is something well-defined, that one can touch: a still, a frame, a rectangle, a piece of celluloid. Whereas with video, there is no real stasis, no ‘still’ in other words; the image is in perpetual formation, thanks to a scanning mechanism. Such a distinction can radically impact the way we conceive images, which is bound to differ from one medium to another. Rather than experimenting with sequences of stills, one is here working, on both macro and micro levels, with the pulses of an ever-appearing and disappearing continuum of luminous images. (2005: 201)

Following hard on the heels of the ‘death of photography’, digital cinema was rapidly associated with the death of film narrative. In the special ‘Digital Cinema’ issue of Screen, Sean Cubitt noted a ‘common intuition among reviewers, critics and scholars that something has changed in the nature of cinema—something to do with the decay of familiar narrative and performance values in favour of the qualities of the blockbuster’ (1999: 123). Lev Manovich aligned the predominance of ‘blockbusters’ with ‘digital cinema’ by defining the latter almost entirely in terms of increased special effects: ‘A visible sign of this shift is the new role which computer-generated special effects have come to play in the Hollywood industry in the last few years. Many recent blockbusters have been driven by special effects, feeding on their popularity’ (1999). Paul Young discerned a ‘reliance on digital technology to produce spectacle at the expense of narrative’ (1999: 41), while filmmaker Jean Douchet expressed the shift in more extreme terms: ‘[Today] cinema has given up the purpose and the thinking behind individual shots [and narrative], in favour of images—rootless, textureless images—designed to violently impress by constantly inflating their spectacular qualities’ (quoted in Buckland, 1999: 178). Other writers such as Geoff King (2000) and Andrew Darley (2000) elaborated the theme into book-length analyses.

How should such claims be evaluated? In terms of commercial success, it is undeniable that the highest grossing films of the last decade have been dominated by genres such as action, adventure, fantasy, horror and science fiction, which make copious use of special effects. However, it would be reductive to treat this trajectory as if it were simply generated by digital technology. As Scott Bukatman (1995) noted in a thoughtful essay focusing on the work of special-effects pioneer Doug Trumbull, the shift to effects-laden films capable of producing a technological ‘sublime’ predates digital technology. Most analysts date the rise of the ‘blockbuster’ from the 1970s, with films such as Jaws (Steven Spielberg, US, 1975), Star Wars and Close Encounters of the Third Kind (Steven Spielberg, US/UK, 1977). As Bordwell notes, never before had films made so much money so quickly (2006: 2). But none of these films employed digital effects, apart from Lucas’ successful adoption of motion control. The dominance of the blockbuster in contemporary cinema must therefore be situated in relation to other shifts in the film industry, including the emergence of multiplexes and megaplexes (Acland, 2003), the growing importance of international revenues to Hollywood production, and the fact that film faces increasing competition from other entertainment platforms.

Nevertheless, digital technology has played an important role in this trajectory. As David Waterman notes: ‘While computer use in film making would seem to have real cost-saving potential, computer technology seems to have fuelled, not mitigated, rising film budgets’ (2005: 234). The average cost of Hollywood films, excluding marketing costs, has spiralled from US$3.1m in 1971 to US$60m in 2005. While there are many factors contributing to increasing production costs, including burgeoning ‘star’ salaries for bankable actors, producers, writers and directors, the increasing scale and complexity of film production is a significant part of the story.

While the sort of morphing effects that made T-2 an eye-opener in 1991 are now easily obtainable on PCs, Hollywood has consistently raised the bar, both in terms of the complexity and cost of individual effects, and the number of them packed into a single film. Stunts and special effects now regularly comprise 10-15 per cent of the budget of major studio films. Some far exceed this: The Matrix Reloaded (Andy Wachowski/Larry Wachowski, US, 2003) and The Matrix Revolutions (Andy Wachowski/Larry Wachowski, US, 2003) devoted US$100m of their joint US$300m budget to stunts and effects. Waterman contrasts the famous car chase in The French Connection (William Friedkin, US, 1971) to a comparable sequence in The Matrix Reloaded (2005: 216). Where the former involved a single stunt driver and the destruction of three or four vehicles, the latter included the construction of a six-lane freeway at a cost of US$2.5m and the demolition of US$2m worth of cars. Another sign of this tendency is the rapid increase in the number of people involved in effects. Whereas the average number of end credits for special/visual effects for a major film in 1971 was 2.1, by 2001 this figure had risen to 150.

This inflation of effects budgets and crews suggests that spectacle, manifested in the dominance of genres such as action, adventure, sci-fi and horror, has become the currency whereby Hollywood ‘buys’ its lion’s share of the international film market, while also competing for audiences increasingly surrounded by options for their limited entertainment dollars, including mobile phones, iPods, subscription television and the Internet. As Independence Day (Roland Emmerich, US, 1996) proved, spectacular effects can drive an advertising campaign and cut across cultural barriers, enabling an otherwise ordinary film to perform extraordinarily well at the box office. In this respect, it can be argued that, while effects-driven blockbusters are not the sole or necessary outcome of digital technology, they have been an economically dominant tendency. Cameron has gone so far as to suggest that digital technology freed filmmakers from the constraints of the old ‘A’ and ‘B’ picture hierarchy:

[I]n the ’40s you either had a movie star or you had a B-movie. Now you can create an A-level movie with some kind of visual spectacle, where you cast good actors, but you don’t need an Arnold or a Sly or a Bruce or a Kevin to make it a viable film. (quoted in Parisi, 1996a)

If Cameron’s claim is overblown—most major effects-laden films are so costly to produce they demand star presence if only to secure the free advertising space routinely available to star-driven film promotion—it serves to highlight another aspect of the changing economic structure of the film industry, the growing role of ancillaries. The much-remarked decline of the proportion of total revenues derived from theatrical box office is partly a result of the emergence of new release windows such as subscription television, and sell-through and rental video and DVD. But it is also a function of the rising tide of new revenue streams, from toys, clothing and accessories to CDs, games, books and theme park rides. Ravid cites The Lion King (Roger Allers/Rob Minkoff, US, 1994) as a classic example of the ancillary tail wagging the cinematic dog (2005: 36). While it grossed a very respectable US$313m at the North American box office, US$454m abroad, and US$520m in video, this billion-dollar-plus take was far outweighed by the US$3b it achieved in related merchandise sales. Such options multiply in the digital domain. As Lucie Fjeldstad, then head of IBM’s multimedia division, remarked at the time: ‘Digital content is a return-on-assets goldmine, because once you create Terminator 3, the character, it can be used in movies, in theme-park rides, videogames, books, educational products’ (quoted in Parisi, 1995). As David Marshall (2004) has noted, the digital era is one of heightened intertextuality. Digital convergence means that the labour used in designing CG characters for a film can also be utilized on a promotional website, in a videogame, or in off-shore factories manufacturing plastic toys. The commercial reality which is creating pressure for global day-and-date releases to combat ‘piracy’ of high-budget films, coupled to the need for alliances with partners such as fast-food chains and department stores for promotional movie tie-ins and shelf space to sell toys, suggests that the criteria for evaluating the ‘success’ of a film have altered significantly.

But has it ever really been all that different? Presenting the issue in terms of an opposition between spectacle and narrative recycles positions which have been consistently articulated—and regularly reversed—throughout cinema history. Revisionist film history has successfully challenged the stereotyping of early cinema in terms of its narrative ‘primitivism’, arguing instead that it catered for a different mode of pleasure and spectatorship that Tom Gunning (1986) influentially dubbed the ‘cinema of attractions’. In the 1920s, avant-garde filmmakers railed against ‘narrative’, because it was associated primarily with literary and theatrical scenarios at the expense of cinematic qualities. Similar concerns emerged with debates over auteur theory in France during the 1950s, when the ‘literary’ qualities of script were opposed to the ‘properly cinematic’ qualities of mise en scène. In the 1970s, the ‘refusal of narrative’ took on radical political connotations in publications such as Screen, Cahiers du cinéma, Framework, Wide Angle and Camera Obscura. In current debates there has been a widespread restoration of narrative as a filmic ‘good object’. Rather than attempting to resolve the issue in favour of one side or the other, the more salient need is to recognize that narrative and spectacle are inextricably intertwined. Attention then shifts to the examination of the sort of stories being told, and the sort of spectacles being deployed in their telling.

An early attempt to ‘periodize’ digital visual effects in science-fiction films was Michelle Pierson’s argument that early effects sequences were designed to stand out from the narrative and temporal flow by displaying a ‘hyperreal’ electronic aesthetic. Pierson suggests that after Jurassic Park a higher premium was placed on the narrative integration of effects (1999: 169). While her argument was coherent from the spectator’s point of view, it ignored the production side of the process. As CG effects guru Scott Billups recalls, filmmakers had to educate computer programmers in order to achieve a ‘film’ look:

For years we were saying: ‘Guys, you look out on the horizon and things get grayer and less crisp as they get farther away’. But those were the types of naturally occurring event structures that never got written into computer programs. They’d say ‘Why do you want to reduce the resolution? Why do you want to blur it?’ (quoted in Parisi, 1996b)

Digital tools such as Flame gradually introduced ‘defects’ such as film grain, lens flare, motion blur and edge halation to make images look as if they might have been filmed. This suggests that it is not so much the ambition for narrative integration of CGI which has changed, as the capacity to realize that ambition. In the process, as Michael Allen (2002) notes, special-effects shots have gradually got longer as they have become more able to withstand spectator scrutiny. Once ‘live action’ images become more or less indistinguishable from CG images, Stephen Prince’s concept of ‘perceptual realism’ offers a useful refinement to the category of ‘realism’. In the context of digital imaging, Prince suggested that the operative ‘referent’ is less the real world than our audio-visual experience:

A perceptually realistic image is one which structurally corresponds to the viewer’s audio-visual experience of three-dimensional space … Such images display a nested hierarchy of cues which organise the display of light, colour, texture, movement and sound in ways that correspond to the viewer’s own understanding of these phenomena in daily life. Perceptual realism, therefore, designates a relationship between the image on film and the spectator, and it can encompass both unreal images and those which are referentially realistic. Because of this, unreal images may be referentially fictional but perceptually realistic. (1996: 32, my italics)

In other words, because of the extent to which audiences have internalized the camera’s qualities as the hallmark of credibility, contemporary cinema no longer aims to mime ‘reality’, but ‘camera-reality’. Recognizing this shift underlines the ambivalence of realism in the digital domain. A filmmaker’s ability to take the image apart at ever-more-minute levels is counterpointed by the spectator’s desire to comprehend the resulting image as ‘realistic’—or, at least, equivalent to other cine-images. This heightens the need to understand the way in which images are constructed into texts in order to achieve credibility, whether this is a function of ‘fiction’ or ‘documentary’.
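The synthetic ‘defects’ discussed above (grain, motion blur and the like) are, in practice, small image-processing passes applied to over-clean CG frames. A minimal sketch of two of them with NumPy (the parameters are arbitrary; tools such as Flame do far more):

```python
import numpy as np

def add_grain(img, strength=0.04, seed=0):
    """Overlay pseudo-random 'film grain' on a float image with values in [0, 1]."""
    rng = np.random.default_rng(seed)
    return np.clip(img + rng.normal(0.0, strength, img.shape), 0.0, 1.0)

def motion_blur(img, length=9):
    """Smear each row with a box kernel, mimicking horizontal camera/subject motion."""
    kernel = np.ones(length) / length
    out = np.empty_like(img)
    for c in range(img.shape[2]):        # each colour channel
        for y in range(img.shape[0]):    # each scanline
            out[y, :, c] = np.convolve(img[y, :, c], kernel, mode="same")
    return out

# A hard CG edge looks 'too clean'; degrade it toward camera-reality.
frame = np.zeros((64, 64, 3))
frame[:, 32:] = 1.0                      # crisp synthetic edge
filmed_look = add_grain(motion_blur(frame))
print(filmed_look.shape, round(float(filmed_look.mean()), 3))
```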

One of the more interesting theses concerning the impact of digital technology on film narrative is put forward by Cubitt, who sees ‘technological’ cinema as an extension of what he calls the shift to ‘neo-baroque’ film (2004: 235). The neo-baroque is marked by the exploration of increasingly detailed diegetic worlds in which the temporal axis of narrative progress and resolution is displaced by spatialization:

Space succeeds time as organizing principle synchronously with neo-baroque narratives’ turn to the database form, a spectacularization of plot in an ironic mode in which mere coincidence satirizes the classical working-through of causes and their effects. As the lifeworld appears consistently more random, so the mediascape becomes more scathing at any pretence at order, mocking the revelations and resolutions that once passed as realistic. In the process, pattern is divorced from its old task of establishing morality. (Cubitt, 2004: 249)

For Cubitt, this shift raises an ethical problem, insomuch as the spatialized narrative ‘worlds’ of contemporary effects-laden cinema promote a stunted intrapsychic narcissism which forecloses the relation to the other on which ethical communication is based.

Arguably the most influential attempt to theorize ‘digital cinema’ has been the various offerings of Manovich, which culminated in The Language of New Media (2001). Manovich describes an historical ‘loop’ in which the growing control over the image granted by digital technology has enabled the once marginal cinematic practice of animation to encompass the whole of cinema in the digital era. In this context, live-action footage and the ‘reproduction of reality’, proclaimed as the essence of cinema by theorists such as Siegfried Kracauer (1960) and Jean Mitry (1963-5; 1998), is dethroned to become simply another element in a general art of animation:

In retrospect, we can see that 20th century cinema’s regime of visual realism, the result of automatically recording visual reality, was only an exception, an isolated accident in the history of visual representation. (Manovich, 1999)

While the argument captures the way that the digital threshold enhances the fluidity of the film image, there are significant problems with Manovich’s formulation. One is that his understanding of ‘live action’ cinema in terms of its direct reliance on ‘physical reality’ is extremely reductive. Cinema has always involved the plastic transformation of physical reality, not least through the conventions of montage which became central to film narrative nearly a century ago. Dziga Vertov, whose Constructivist classic Man with a Movie Camera (USSR, 1929) is converted by Manovich into a precocious ‘database narrative’ that provides the leitmotif for his book, boasted that the film-eye could decompose and recompose physical reality at will. Yet Manovich ignores this long history of plasticity to set up a neat contrast between ‘automatic recording’ and ‘animation’. In this respect, Manovich’s position dovetails neatly with the stance of filmmakers such as Lucas, who has long proselytized the ability of the new technology to realize directorial vision:

I think cinematographers would love to have ultimate control over the lighting; they’d like to be able to say, ‘OK, I want the sun to stop there on the horizon and stay there for about six hours, and I want all of those clouds to go away’. Everybody wants that kind of control over the image and the storytelling process. Digital technology is just the ultimate version of that. (quoted in Magid, 1997: 52)

Ultimate control fits Lucas’ aesthetic, which places little or no premium on the peculiar attractions of ‘automatic recording’, whether of complex locations such as city streets or equally complex psychological terrains such as an actor’s face. (Star Wars lead Mark Hamill notes ruefully that, if Lucas could produce films without actors, he probably would [Seabrook, 1997: 53].) However, erecting ‘film as animation’ into the totality of cinema in the digital age underestimates the persistent attraction of the chance effects of ‘live action’. Long ago Walter Benjamin eulogized the productive aspect of the encounter between the camera and the world in terms of the camera’s registration of the ‘spark of contingency’ (1979: 243), a value Roland Barthes later ontologized as the photographic punctum (1984: 47). If Lucas’ highly controlled cinema-as-animation is one dominant tendency of contemporary cinema, it finds various counter-tendencies in the continuing pull of location shooting and improvised performance, in the anti-interventionist stance of the Dogme movement, and in the desire for ‘authentic’ social interactions in various ‘reality TV’ scenarios.

Film Studies in the Twenty-First Century

In one of my first teaching jobs in the mid-1980s, I well remember having to lug a 16mm projector across campus in order to screen films. On one memorable occasion during Letter to Jane (Jean-Luc Godard/Jean-Pierre Gorin, France, 1972), I belatedly realized the take-up reel wasn’t functioning properly, and had to wind it by hand as metres of celluloid spooled across the floor. I was understandably nervous about potential damage, as it was probably the only copy of the film in Australia, sent by rail from the National Library in Canberra. Conditions of access are remarkably different today. DVDs and online databases have made it far easier to view certain texts, particularly current releases and ‘classics’. But this easy availability raises its own issues for Film Studies. In a world of ubiquitous digital images, what incentive is there for students to take the trouble of searching out and accessing films? And, even if we are prepared to trade off the specific materiality of celluloid for the rewards of broader accessibility, are we utilizing digital technology in the most effective way?

Patricia Zimmerman has argued persuasively for the need to ‘re-imagine the borders of film history’ to include a broader range of film production than the existing canon (2001: 111). While the number of DVD titles is expanding, the release of historical material with limited market appeal is inevitably selective. Writing in 2004, Kay Hoffmann noted that only 100 films from between 1920 and 1928 were available on DVD, most of them from the US (2004: 161). Jan-Christopher Horak asks:

If only a limited canon is available for such classroom use, then only the canon according to Blockbuster will indeed be taught and shown to students. How do you teach a course on Third World cinema, on American independent documentary, on classical documentaries from the 30s, on avant-garde films from any period, when at present virtually no one is willing to finance their digitalization? (2003: 21)

This is the ‘dark side’ of the undoubted potential for digital media to enrich Film Studies, not only by improving access to films, but also by supplementing them with a range of ephemeral materials such as posters, scripts, design sketches and stills, as well as critical voiceovers and alternative language tracks.

One productive way in which digital technology might contribute to a broadening of the film canon is by facilitating new modes of access to established archives. Projects such as Moving History in the UK involve the construction of an online catalogue and guide to archive collections, with the aim of promoting archival resources to educational users. Frank Gray and Eileen Sheppard express the hope that ‘an expanding knowledge of film-archive collections will influence not only the production of new histories of film and television, but also interdisciplinary research across the arts and humanities’ (2004: 116). They also point out that the viability of such projects depends on the allocation of scarce resources away from traditional tasks of archiving to educating end-users in negotiating technological differences, resolving copyright issues, and the like. Achieving best practice will depend on the extent to which film archives are able to gain wider recognition as a critical cultural resource.

The ambition for a broader array of disciplines to make more critical and systematic use of film resources is paralleled by the new range of pressures on Film Studies as a coherent discipline. Digital convergence brings film into close relation not only with television, but also with the Internet, video gaming and mobile media. As well as the need to explore both the continuities and specificities of different platforms, there is an increasing need to recognize that globalization demands more situated analyses of specific audiences. Attempting to fence off a domain specific to Film Studies in the context of digital media strikes me as an exercise in nostalgia which risks irrelevance, particularly to future students born after the explosion of the Internet in 1993. As those such as Henry Jenkins (2006) point out, fan culture has been transformed by the Internet, and has established such a degree of influence that major film productions no longer take place ‘in camera’, but with the enthusiastic participation of extensive fan communities. And while the demand for audience-configured ‘interactive’ narrative pathways has never seriously threatened to alter mainstream theatrical releases, user navigation has nevertheless emerged as a major driver of contemporary entertainment culture. If the burgeoning field of games studies does not belong entirely to Film Studies, it cannot remain entirely separate. The shift to subscription-based models for online gaming offers a far steadier income stream than the notoriously volatile film industry, where the mega-profits of one year can easily give way to catastrophic losses in the next. No doubt the major studios are watching closely, as much as they are watching websites such as YouTube, CurrentTV and MySpace. In this time of transition, it is worth remembering that the productivity of Film Studies has historically lain in its willingness to embrace and develop interdisciplinary methodologies. Innovation in both empirical research and theoretical paradigms seems more necessary than ever.