Bad movies are nothing new. Surely they date as far back as cinema itself; as soon as you have more than one of anything, you run the risk of one being “better” than the other. American film history is littered with some pretty pungent cinema stinkers—the hyperbolic Reefer Madness came out in 1938; the notorious flop I Take This Woman, with Spencer Tracy and Hedy Lamarr, emerged in 1940; and the nadir of Bette Davis’ career, Beyond the Forest, was unleashed in 1949.
But something seems to have happened in the middle of the last century. Suddenly, around 1950, after fifty years of movie-making, we got worse at it instead of better, and every decade since has brought more and more disastrous big-screen bombs. In Harry and Michael Medved’s early tome on bad movies, 1980’s The Golden Turkey Awards, the appendix of the 200 worst film “achievements” of all time lists only four titles from before 1950 (The Big Noise, The Kissing Bandit, The Return of Dr. X, and the aforementioned Reefer Madness). It is also noteworthy that Wikipedia’s entry on the “Worst Films of All Time” includes nothing made before 1953, and that its decade-by-decade list of bad movies only grows longer with each subsequent ten-year period. Telling, too, that the Razzies didn’t see fit to exist until 1980.
Think movies aren’t getting steadily worse? Well, only three years into the 2010s, our current era has already been subjected to Movie 43, Bucky Larson, Battleship, and the Twilight series clogging up the theaters. Sad to say, the ongoing trend of full-on cinema schlock shows no sign of abating any time soon, at least not while Adam Sandler, Eddie Murphy, and Sly Stallone still have anything to say about it.
What happened?
Of course, moviemaking didn’t completely go off the rails in the ’50s, not when films like Vertigo, On the Waterfront, and Singin’ in the Rain still managed to get made. Nor has any subsequent decade been without its fair share of true masterpieces. Still, something has surely gone askew for films like Gigli and Birdemic to come to fruition.
One theory that helps explain the masterpiece’s decline and the mediocre’s rise is easier access to filmmaking tools. Not only are more films being made today, they are also no longer the province of a few well-funded, carefully controlled studios out on America’s West Coast. In many ways that is a good thing: the narrow Hollywood scope has expanded to include many quality films from many quality filmmakers. But such easy access can also be a bad thing. The Room, after all, did not have any studio backing, and Troll 2 was not a Paramount production. Suddenly, and more easily all the time, anyone with a camera and some free time can make a movie.
Still, this does not explain big-budget, major-studio debacles such as Jack and Jill, Showgirls, and Freddy Got Fingered. So what makes these films of recent vintage seem so much worse than the subpar, sub-B programmers of the ’30s or ’40s? And why do we keep churning out so many of them year after year? I have a couple of theories.
Beginning in the ’50s, and gaining momentum every year since, movies have had to “up their game” (so to speak) to compete with television and its siphoning of customers out of theaters. To entice audiences back to the theaters (and eventually the Cineplexes), movies have, for better or worse, worked hard to be more “adult,” amping up violence, sex, nudity, language, and other so-called “adult themes” until they are either on the “cutting edge” or, at least, far removed from anything that could ever be shown on non-cable television.
The “adult” language, gross-out shtick, scatological references, sexual double entendres, and anatomical fixations that now populate many of the worst films coming out of Hollywood, from Spring Breakers to The Paperboy, often make these films disturbing, not just stupid or slow-paced, to sit through.
In contrast, films of the ’30s and ’40s, which can be silly, boring, and dumb, are seldom over-the-top offensive. Even the racial and ethnic stereotypes that populate many vintage films can at least be understood within the context of the era in which they were produced and the dominant culture controlling the narrative. Current films, those of the past twenty years or so, have no such excuse. Hence the greater disturbance (and outrage) at the Asian stereotypes in The Social Network or the curious case of Jar Jar Binks in The Phantom Menace.
Bigger budgets and filmgoers’ greater knowledge about the filmmaking process (especially its finances) have further altered how we ultimately judge the films we see.
Granted, making movies has long been a costly endeavor. Allegedly, 1916’s A Daughter of the Gods, with swimming star Annette Kellerman, was the film world’s first million-dollar movie. It was followed by Foolish Wives in 1922, costing $1.1 million, and When Knighthood Was in Flower, also from ’22, at $1.5 million. This was back when a million was still a million.
But, today, thanks to the internet and publications like Entertainment Weekly, movie budgets get reported and dissected (think Waterworld) long before the finished product makes it to the big screen. Were the ledgers of Gone With the Wind and The Wizard of Oz so well known before they premiered?
Runaway movie costs (and the star salaries that often bloat them even more) are also a post-1950 (in fact, post-1960) phenomenon. In James Robert Parish’s 2007 book Fiasco: A History of Hollywood’s Most Iconic Flops, Cleopatra (rightfully) gets the kick-off chapter and is the earliest film discussed; it was released in 1963. Of the other 12 movies profiled (Town & Country, Cutthroat Island, Ishtar, The Cotton Club, Last Action Hero, et al.), nearly all were produced after 1980.
Foreknowledge of a film’s costs and overruns can alter our moviegoing experience as we sit there in the darkened theater and wonder: How can anything that cost this much be this boring or this dumb?
Such blatant conspicuous consumption disturbs us on a level far more basic than how much we paid for our movie ticket. Oh, the wretched excess and utter waste of it all! And it is this differential that allows us to more readily forgive, perhaps even find affection for, smaller, independent bad films like The Room or Plan 9 in a way we can never muster for a Heaven’s Gate, a Battlefield Earth, or the latest nine-figure Bruce Willis would-be blockbuster.
It’s rather quaint now to remember the runaway success of The Blair Witch Project in 1999 and how its low-budget creativity, along with its internet-based ad campaign, was supposed to change film forevermore. Well, that didn’t happen. Disney’s infamous John Carter from last year showed that runaway budgets (and bad ideas) are still part and parcel of many Hollywood productions. In fact, more and more films are flirting with budgets in the hundreds of millions of dollars, making their break-even points ever more precarious and audience expectations ever higher and more demanding.
Meanwhile, the race to see “how low can you go” continues apace, as Movie 43 undeniably proved. Age has not slowed down the Frat Pack, that unofficial club of (usually) bankable male stars (Vince Vaughn, Will Ferrell, Seth Rogen, et al.) who have never grown up, and, judging by last year’s Bachelorette, women show signs of following in their footsteps. South Park creators Trey Parker and Matt Stone are still Hollywood heavyweights. And, lest we forget, Seth MacFarlane just hosted the Oscars, where he sang a song about “boobs.”
All indicators are that the movies will not, collectively, be getting any better any time soon. The ’30s may have given us the Great Depression and the ’40s the Second World War, but where films are concerned, those decades are looking better and better all the time.