The Front Page: CGI Sucks!

Last week, James Cameron announced that after 10 years in post-Titanic exile (where, granted, he did produce a great many personal projects, including Aliens of the Deep), he was smack dab in the middle of his next production, an ambitious sci-fi epic entitled Avatar. The storyline, rumored to center around a US soldier sent to a faraway planet to participate in its war, will be a massive undertaking, with live action elements mixing effortlessly with something the director calls “photo-realistic” CGI. In an interview with ‘Ain’t It Cool News’ honcho Harry Knowles, Cameron indicated that filming had already begun, and that he should have the initial elements wrapped up by the end of this year.

Sounds like a sensational Summer of 2008 release, right? Wrong. In his talk with Knowles, Cameron went on to say that Avatar will not be arriving at your local Cineplex until sometime in 2009, if then. Apparently, the technology being used to render these amazing digital visions – extraterrestrials, space landscapes, intense battle sequences – will take that long to plan, perfect and execute (the effects are being handled by Peter Jackson’s company, Weta). Unlike other CGI, Cameron warns, the material in Avatar will be the next generation in visual effects, lifting the medium from its sloppy, Sci-Fi Channel Original Movie leanings toward a successful melding of life with virtual reality.

As the geek contingent self-flagellates over the possibilities, and the inevitable sniping starts over carefully leaked storyline and character elements, the rest of the moviegoing public will have to wait another 24 months before discovering if Cameron is the next Stanley Kubrick, or just another run-of-the-mill George Lucas. It’s a dazzling, daunting possibility. More than anyone else, the aforementioned 2001 titan brought serious science fiction to the realm of cinematic artistry. On the other hand, Mr. Star Wars has proven that CGI can be both a boon and a burden. From using the technology to revamp his original Trilogy, to relying on it exclusively to visualize his noxious prequels, Lucas (with perhaps a little help from Jackson) has illustrated the main weakness inherent in the art form.

You see, when done right, CGI is a brilliant cinematic supplement. It presses out the creative creases in complicated sequences and adds an otherworldly pizzazz that standard cinema has a hard time replicating. When used in conjunction with other elements – set design, directorial flair, narrative complexity – it can lift a film into a realm where fantasy truly meets, and easily coexists with, reality. But when done incorrectly, when over-utilized and brutalized for the sake of some silly desire for more, more, more (read: the Lucas technique), you end up with…well, you end up with animation. Instead of something that resembles the world around us, the artificial nature of the medium pushes us out of the experience. Our eyes and our brains know it, even if the people behind the production don’t.

One of the biggest flaws in old George’s Vader-redefining films is the reliance on digital to create all the filmic facets – sets, props, creatures, action. No matter the attention to detail provided by Industrial Light and Magic and the talented artists employed, the human mind still responds with suspicion when images look too good, when they announce their intention to trick. Take the cityscapes used throughout the prequels. They look amazing with their gravity-, physics- and pragmatics-defying dimensions. Buildings rise miles into the air, landing platforms jutting out like impractical parking ramps. The skylines shimmer with a paradoxical presentation of awe and ambiguity. We enjoy the eye candy treat, but take away very little cinematic sustenance. Similarly, when all manner of mind-blowing creatures are carted out over and over again, sometimes for the sake of mere variety, we feel the need to disavow the dynamic.

That’s the problem with most current CGI efforts. From clunky beings that look worse than the earliest computer rendered experiments to obvious attempts to expand a normally nominal vista, the digital domain has turned the art of optical effects into a glorified ruse. It’s all smoke and mirrors, carefully crafted software and proprietary technology twisted into the most synthetic of cinematic styles. There are excellent examples of intricate incorporation. There are also models of meaningless modification. But the simple fact remains that a computer just cannot create the tactile, textural experience of well-done physical effects.

A perfect example of a director who made such old-school methods work, and work brilliantly, is Terry Gilliam. All throughout his breathtaking Ages Trilogy (Time Bandits, Brazil and The Adventures of Baron Munchausen), the ex-Monty Python animator and true creative genius forged fantastical wonders with puppets, perspective, miniatures, green-screen, and all manner of make-up and animatronic magic. From figuring out a way to feature star Jonathan Pryce in full atmospheric flight to rendering Python pal Eric Idle the fastest man on the planet, Gilliam conspired with his crew to create the impossible out of the practical. Students of the medium know all the tricks – the cotton matting clouds, the use of camera speed to suggest weight and heft, the application of motion control and intricate detailing to give items size and merit. In Gilliam’s talented hands, well-crafted F/X aren’t fake or phony. Instead, they merge effortlessly with the filmmaker’s overall vision, working to keep the audience locked well within his otherwise obtuse ideals.

The same goes for someone like Ridley Scott and his magnificent set of late ’70s/early ’80s epics: Alien, Blade Runner and Legend. As close to a perfect combination of movie and mannerisms as ever created, Scott’s simple designs – to take viewers to places they’d never dreamed possible – are executed not with computers and programs, but with painstaking interaction between artists and the motion picture medium. From H. R. Giger’s definitive interstellar villain to the look of L.A. circa sometime in the far-off future, the reliance on the real, not the bitmap and binary, gives these movies a richness and a realism that technology has yet to capture. Sure, Tim Curry had to go through Hell to take on the persona of the Lord of Darkness, his hours in the make-up chair challenging his patience and his health. But when the results are as resplendent as they are in Legend, when he is flawlessly lost inside the demonic dimensions of his character, it’s easy to excuse the sacrifice.

Other filmmakers like Tim Burton (with his effects-style clinic Beetlejuice) and Sam Raimi (delivering his demented Dead films without a single CGI supplement) equally established that even the cheesiest physical effect could work as long as the elements surrounding it matched the filmmaker’s motives perfectly. Even Cameron proved this with his stellar sequel Aliens. It’s impossible to imagine the movie’s climactic moment rendered digitally. It would seem silly for Sigourney Weaver’s Ripley character to gear up for her battle with the Queen Mother in a totally CGI robotic forklift suit. Call it reverse rejection. With physical effects, the eye sees the stunt, and starts scanning the image for imperfections. With CGI, the vision is so slick that we initially overlook its misdirection. But then the less-than-real aspects announce themselves, and we lose interest in the subterfuge.

It’s the biggest problem with modern computer graphics. Unless a great deal of time and care is taken in how a sequence is staged and rendered, the difference between a cosmic clash between warring interstellar factions and a Saturday morning cartoon becomes almost negligible. The eye can only register so much detail before the brain is boggled and begins to turn off. Sadly, the individuals in charge of today’s slick science creations forget this, and try to pack as much intricate specificity into each scene as possible. That’s why Lucas’ arguments about “improving” his original Trilogy can’t stand. We believed the films when they first arrived in theaters, their sense of optical splendor a solid emotional memory for anyone lucky enough to see them back then. Now, they look tinkered with, taken to unrealistic lengths by a man who believes obsessively in the power of his microprocessors.

Hopefully, Cameron won’t fall into the same self-indulgent trap. He practically wrote the book on merging the physical with the computerized in Terminator 2 and Titanic. But with this new mandate to dump the practical and move toward the totally digital, we could be witnessing another creative crash and burn from a filmmaker who should know better. Just because audiences bought the mostly IBM-made Middle Earth and all its CG creations doesn’t mean that Peter Jackson’s auteur input should be diminished. After all, Final Fantasy: The Spirits Within was a semi-realistic rendering of sci-fi/fantasy reality, and you don’t hear fans harping over its filmmakers’ lack of an Oscar. No, cinematic skill needs to accompany the new tendency toward supercomputer creativity. The two F/X forms can live together in a kind of motion picture bliss, each one supporting and complementing the other. Maybe James Cameron is correct in taking the next two years to make sure his Avatar sets the standard for all computer graphics to come. If he fails, it will be another example of invention usurping imagination for no good reason.