Dune: Part Two, Denis Villeneuve

A Case Against Two-Part Movies

With the rise of the two-part movie fad, even filmmakers of Denis Villeneuve’s caliber may be devaluing what makes the cinematic experience, well, cinematic.

This year, American cinema saw the release of two major “part ones” that weren’t, in fact, first films at all. Both are sequels: Joaquim Dos Santos’ Spider-Man: Across the Spider-Verse and Christopher McQuarrie’s Mission: Impossible – Dead Reckoning Part One (two titles maxing out their share of dashes and colons) were marketed not just as the next entries in a series, but as merely the first halves of those entries. It’s an increasingly common distinction that suggests the original films are fundamentally incomplete, like the two-part finale of a television season. Next year will bring not only their follow-ups but also Denis Villeneuve’s long-awaited Dune: Part Two, Jon M. Chu’s Wicked: Part One, and both chapters of Kevin Costner’s two-part Western Horizon: An American Saga. Seeing a single film used to require just one trip to the theater; increasingly, it requires two.

Splitting so-called single films into two parts can be traced back farther than you might think. As early as 1924, Austrian director Fritz Lang released two Die Nibelungen films, each of which functioned more like one half of a serial than a separate movie. For the next several decades, however, the closest studios came to splitting up individual films was when they started producing discrete follow-ups concurrently. The sequels to Back to the Future, The Matrix, and The Lord of the Rings were all filmed back-to-back and essentially marketed as pieces of a whole, with future chapters being a foregone conclusion. Audiences started going to second films already knowing that they weren’t just part two; they were part two of three. Viewers adjusted their expectations accordingly.

It was Quentin Tarantino, however, who established the model that Hollywood is still milking today. Kill Bill (2003) was originally conceived and filmed as a single film before the director decided to split his revenge saga into two volumes released six months apart. One of the fundamental problems with this strategy begins here: while Tarantino’s stated goal was to avoid an overlong single film, padding out two entries ironically produced the same overstuffed effect. There’s a sense, especially in Kill Bill: Volume Two, of deleted scenes from the prior film being re-inserted, allowing for every indulgence at the expense of pacing. You could argue that more of a good thing is just that, but it’s also tempting to imagine a unified Kill Bill that’s both more disciplined and more epic.

Still, at least Tarantino’s idea seemed more of an artistic than a financial choice. It would be seven more years until Hollywood saw the real money-making potential of two- and three-part films, with two blockbusters that would prove most consequential to the future of movie-splitting. The decision to separate Harry Potter and the Deathly Hallows into two films was partially practical: at 607 pages, it was the second-longest book in J.K. Rowling’s series. (Never mind that the longest one, The Order of the Phoenix, ended up as one of the shortest movies.) But the more irresistible angle for studios was turning one sure-fire hit into two just when they were about to run out of source material. The gambit was so successful that it was also used for the final entries in both the Twilight and Hunger Games series, whether or not they had four hours’ worth of story to tell.

If greenlighting two or three films at once seems financially risky, it makes sense that the trend started with the most dependable literary properties with built-in fanbases; there was no way rabid Edward and Bella stans from the Twilight series were going to turn their noses up at an extra trip to the theater. But it wasn’t always a win-win for audiences and studios. Just ask fans of the Divergent series (they must be out there) who turned up for part one of the two-film finale Allegiant, only for its planned second half to be scuttled due to poor box office. Surely, there are even fewer fans of the three-film monstrosity Peter Jackson made out of The Hobbit, which unfortunately was seen through to the end. There were still more hits than misses, but it became clear that audiences wouldn’t necessarily tolerate every short novel being spread dangerously thin over several movies.

So what could be even more of a guarantee, at least in the eyes of notoriously risk-averse studios? Enter Marvel. There was no surer thing in the late 2010s than the final, two-part Avengers films, both of which quickly became some of the highest-grossing movies of all time. What is the MCU anyway, if not one unending film stretching on as long as there are still butts in seats? There’s no better example of the half-film than the Russo Brothers’ Avengers: Infinity War, which ends with a whopper of a cliffhanger so devastating that it’s more galvanizing than frustrating. With that unqualified artistic and financial success, it became clear that what Harry Potter started had long outlasted its Young Adult origins. With the Spider-Man and Mission: Impossible films in the mix, any proven intellectual property is eligible.

Does this fad in filmmaking signify the death of cinema? The question almost seems quaint, given how mainstream moviegoing has already long succumbed to sequels, franchises, expanded universes, and brand extensions. Yet, for those concerned with the theatrical experience, there is still something uniquely disconcerting about movie-splitting. Now that streaming services (especially Marvel’s TV arm) are steadily blurring the line between “film” and “content”, movies have never felt less like standalone experiences. Why not stay home and wait until both parts are available to watch back-to-back? Why pay for two tickets, months or years apart? Who wants to go to the theater for something that doesn’t feel like a finished film?

The ending of Irvin Kershner’s Star Wars: Episode V – The Empire Strikes Back is as open-ended a cliffhanger as they come, yet few would argue that it isn’t a fully satisfying, self-contained experience even without Richard Marquand’s Star Wars: Episode VI – Return of the Jedi to follow. The problem is that few “first parts” pass that test. The current model, at its worst, has resulted in half-films that seem to just stop in the middle. Even Villeneuve’s Dune, possibly the most critically acclaimed part one yet made, feels fundamentally halved, lacking internal arcs or even a climax that isn’t wholly dependent on a second film. Frank Herbert’s 1965 sci-fi epic Dune was a single novel for a reason. As for the film, pausing smack in the middle for almost three years doesn’t do the story any favors. If the absence of a part two negates the first film entirely, something has gone awry.

More concerning is that this trend is no longer confined to easily ignorable superhero films. Filmmakers of Villeneuve’s caliber may be inadvertently helping devalue what makes the cinematic experience, well, cinematic. Even a dependable, bang-for-your-buck series like Mission: Impossible used to be appealing because it didn’t matter if you caught the last one or remembered where things left off. It’s much harder, though, to get excited about Dead Reckoning: Part Two without studying up on Dead Reckoning: Part One. Worse, the lack of a readily available part two might make audiences less eager to engage with part one.

Going to the movies used to be a lot more like enjoying a stand-alone Law & Order episode than trying to catch up with the entire story arc of Game of Thrones. As with television, we might only start missing the “case of the week” once it’s gone.