I did not begin 2014 by imagining that the most resonant movie moment of the 12 months to come would be a quiet, resigned stare-down in a bathroom. But it has been that kind of year. Alejandro G. Iñárritu’s Birdman tells the story of an actor trying to outrun a character that threatens to devour him. Birdman is his support system, his claim to fame, the devil on his shoulder, and the demolisher of his soul. In short, it’s his best and worst self, depending on his mood. The film is a flight of fancy, but it’s a very sad one; the wind beneath its wings is a crisp and rather bleak message about what movies are right now — and, perhaps, what they will continue to be for as far ahead as we can see.
At the end of Birdman (mild spoilers follow; skip this paragraph if that freaks you out), Riggan Thomson has come through the fire, scathed but still standing. An actor best known for his starring role in a trilogy of superhero movies, he has spent Birdman’s running time trying to fluff up the feathers of his dampened ego by questing after legitimacy, which he hopes to find in a serious Broadway play of his own devising and direction. Riggan has poured sweat, tears, and, at the climax, a copious amount of blood into his art — or maybe just into his narcissistic need to be respected — and he’s won, but at no small cost. Now, in a hospital bathroom, he finds himself face to face — or really, beak to beak — with not only his own remade reflection but Birdman himself, who has, in full costume, made himself comfortable on the commode in a way that reawakens questions about certain superhero practicalities that have crossed the mind of every kid who ever read a comic book. So there they are, the two of them, taking each other in: silent, irritated, perplexed. Whatever kind of contest between Riggan and Birdman we’ve just been watching, it seems to have ended in a futile tie.
What does this moment mean? In the movie’s artistic scheme, it means Riggan can’t ever truly free himself of the needy, frail ego that his sturdy, gruff alter ego represents. But if you work in or follow the movie business, what we have here is a grim joke with a grimmer punch line: There is no escape anymore. We will never get away from Birdman, even as he threatens to poop all over everything. If movies have, for a century, been the repository of our dreams, and every generation gets the dreams it deserves, then ours is Rodin’s The Thinker reimagined as a superhero poised on the edge of the crapper, and the rest of us poised on the edge of … well, it may be a little extreme to invoke the abyss. But we’re on the edge of something, and the something is big and dark and annihilating. So call it what you will, but come up with a name fast, because we’re all about to get sucked in.
Birdman opened in the United States on October 17. It is not, financially, a consequential movie; it was released by Fox Searchlight, the prestige, indie-ish subdivision of 20th Century Fox; it has made decent money (nearly $21 million domestic) for the kind of film it is; it will win a bunch of awards over the next three months and lose a bunch of others. It is a good movie, but the type of good movie it is has nothing to do with what the movie industry is about. What the movie industry is about, in 2014, is creating a sense of anticipation in its target audience that is so heightened, so nurtured, and so constant that moviegoers are effectively distracted from how infrequently their expectations are actually satisfied. Movies are no longer about the thing; they’re about the next thing, the tease, the Easter egg, the post-credit sequence, the promise of a future at which the moment we’re in can only hint.
So it’s appropriate that the two most important movie events of 2014 weren’t movies at all, but rather what amounted to a pair of live-action trailers. The first came at a Time Warner investors’ summit, when Warner Bros. chairman and CEO Kevin Tsujihara announced a slate of 10 movies based on DC Comics characters to be released between 2016 and 2020. The second came two weeks later, when Marvel Studios chief Kevin Feige took the stage of Hollywood’s El Capitan Theatre, at a fan-service event that had every bit of the importance and money-consciousness of a shareholders’ meeting, to announce Marvel’s “Phase 3”: the nine movies, to be released over the same stretch as DC’s lineup, that will follow the 12 movies in “Phase 1” and “Phase 2.” (The final two of those will be unleashed next year while Warner Bros. is still nudging and prodding its tardy superheroes toward the starting gate.) Add in the Marvel properties — Spider-Man, X-Men, the Fantastic Four — that are owned by other studios, and if you’re a comic-book-movie fan, the next six years of your multiplex menu will look like this:
Depending on how that makes you feel: Yay! Or: Hmm. Might there be anything else to eat?
The term “comic-book movie” is convenient shorthand for the films on that list, but perhaps its time has passed. It suggests there is a bright line separating DC and Marvel films from most of the rest of what you will see between now and 2020. In fact, these 34 films are on a continuum with five Star Wars movies (a trilogy and two stand-alones), three Avatar sequels, three (more) Terminator films, three (more) Lego movies, and a trilogy of movies based on Fantastic Beasts and Where to Find Them, a book by J.K. Rowling that is 42 pages long. Here is a list of sequels and franchise installments — 70 of them at current count, although the actual number will, of course, be much higher, probably more than 150 if the 2015 lineup is any indication — that are set to open over the next six years.
The movie business has never seen the likes of these lists before. They are the beginning of something. They are also the end.
This would be an apt place for me to deviate into a gravelly Gran Torino old-man rant about the permanently arrested, riskless nature of our culture — how everything in modern major moviedom is now derived from material meant for children or adolescents and aimed at adults desperate to remain in that state well into chronological maturity. But that’s not new. Giving the people what they want over and over again until they don’t want it anymore is an idea, and a business model, that is almost as old as the movies themselves. Laurel and Hardy first teamed up before the advent of sound. For a decade beginning in 1937, the Andy Hardy series was a profit machine for MGM, and between 1941 and 1955, Abbott and Costello churned out 27 movies for Universal, helping to turn that second-tier studio into a major player. The biggest and most successful of those series have always put money in the bank — money that got used to make movies that people who make movies dream of making, and that people who see movies dream about after they see them. I don’t want to romanticize that era: Writers were schmucks with Underwoods, and Harry Cohn judged a movie’s length by whether it made his ass hurt, and Jack Warner by how many times he needed to get up to pee. But there were differences. And one was that, while franchises were occasionally an end, they were also a means.
The arrival of Star Wars in 1977 changed everything — but despite a still-popular industry narrative, it didn’t change everything into what it is now. The revolution of George Lucas’s game-changer — in purely financial terms — was that it confirmed what the James Bond series had suggested a decade earlier: There was no ceiling on how much money the right kind of series with the right kind of potentially escalating fan obsession could take in. Over the 25 years that followed Star Wars, franchises went from being a part of the business to a big part of the business. Big, but not defining: Even as late as 1999, for instance, only four of the year’s 35 top grossers were sequels.
That’s not where we are anymore. In 2014, franchises are not a big part of the movie business. They are not the biggest part of the movie business. They are the movie business. Period. Twelve of the year’s 14 highest grossers are, or will spawn, sequels. (The sole exceptions — assuming they remain exceptions, which is iffy — are Big Hero 6 and Maleficent.) Almost everything else that comes out of Hollywood is either an accident, a penance (people who run the studios do like to have a reason to go to the Oscars), a modestly budgeted bone thrown to an audience perceived as niche (black people, women, adults), an appeasement (movie stars are still important and they must occasionally be placated with something interesting to do so they’ll be cooperative about doing the big stuff), or a necessity (sometimes, unfortunately, it is required that a studio take a chance on something new in order to initiate a franchise). A successful franchise is no longer used to finance the rest of a studio’s lineup; a studio’s lineup is brands and franchises, and that’s it. Disney, of all the big companies, is the closest to approaching the absolute zero of this ideal — its movies are virtually all branded, whether Lucasfilm, Pixar, Marvel, or Walt Disney Studios — and anyone who doesn’t imagine that other studio CFOs are gazing at that model in envy and wonder is delusional. Disney is a kingdom of subkingdoms. Nothing minor or modest need apply.
Every generation of studio titan is less apologetic than the one before. In Hollywood’s first golden age, the studios were run by shrewd, canny, undereducated first- or second-generation Americans — crass businessmen who claimed to love Art; they maintained offices in New York City where they deputized aides to elevate the tawdry screen trade from lowbrow to middlebrow by purchasing legitimacy in the form of the best-reviewed new novels and most acclaimed Broadway plays. God bless them every one; their aspirationalism built Hollywood. Decades later, once movies themselves had become a central and even revered part of the culture, that era of mogul gave way to a new breed: the bottom-line businessman who loved movies as long as they were entertaining, but was still willing to reassure doubters that he occasionally liked Art, too, up to a point and in small doses. They still had enough old-Hollywood DNA to feel an obligation to tithe: A modest portion of their profits would always be used to take gambles on the kind of great hopes that didn’t always translate to bottom-line success.
Today we have a different model: The modern studio chief loves business, success, replication, and reliability, and nobody expects him to offer even the most cursory nod to anything that smacks of ideals that relate to content; that’s not what he’s there for. Tsujihara has an MBA from Stanford. He started out managing Time Warner’s interest in Six Flags theme parks, then moved to home entertainment, and early last year took over the whole business. He has never produced a movie; in fact, he is the first studio head to rise in the ranks purely through brand extension and ancillary divisions, and brand extension is what he’s all about. Besides the DC announcement, his big accomplishments have been to nail down those three additional Rowling movies to add to the studio’s portfolio of eight, and to turn one Lego movie into four — a ninja Lego movie, a Batman Lego movie, and (for purists, I suppose) The Lego Movie 2. This is what successful purveyors of goods do; they make more of what sells, they cull what doesn’t from the lineup, and they seek to create products in which quality-of-execution variability is never going to be too much of a wild card. MGM’s old, gloriously lofty motto was “Ars Gratia Artis”; today, the only thing written in invisible ink on every studio gate is “More of What Works,” a credo that would be right at home at the entrance to any manufacturing plant.
Tsujihara’s counterpart at Universal is Jeff Shell, who became the chairman of Universal Filmed Entertainment Group a year ago. He has never made or overseen a movie either; he emerged from the corporate division of Comcast, the Philadelphia-based cable company that now owns Universal, and soon after he got the job, one executive sighed to Variety that “he doesn’t understand the big dreams or the big risks.” That may be true, but it is equally true that those are clearly not among the job qualifications. What counts is that Shell understands the importance of a fourth Jurassic Park movie as a gateway to more, and the importance of the Matt Damon Bourne and the Jeremy Renner Bourne becoming a two-stranded franchise, and understands that, with the seventh Fast & Furious movie just months from opening, it is not too soon to think about Furious 8 through 10, or of Minions, or of China, which, in current multinational-conglomerate terms, is the big dream.
Guys like Tsujihara and Shell are the future disguised as the present. Right now, they exist atop studios side by side with rivals who came up the old-fashioned way, like Alan Horn, who worked in television production, cofounded Castle Rock, went on to run Warner Bros., and now chairs Walt Disney Studios, or Amy Pascal, who brought decades of production and development experience to her 12-years-and-counting chairmanship of Sony’s Motion Picture Group. It’s easy to look at Tsujihara and Shell in one column and Horn and Pascal in another and simply conclude that there are a lot of different types of people who can run a studio and make it work. But that’s not what’s going on. What we are witnessing is not stability but transition — the evolutionary moment of overlap in Hollywood when the old way and the new way transiently coexist. Ten years from now, the old way will be gone. The new way will simply be the way.
Back to those long lists of movies. Historically speaking, Hollywood studios have not traditionally been (certain personality types aside) Stalinist. Five-year plans, let alone seven-year plans, were, for a long time, anathema to a business that wants to be able to roll with whatever the hot new thing is. A decade or so ago, industry observers scoffed at the cart-before-the-horse presumption of the first studio to grab a release date a couple of years down the road for a movie that wasn’t even written or cast yet. But what then screamed of misplaced priorities now counts as best practices. The notion that the movie (or even the idea for the movie) should come first is quaint. It now seems perfectly reasonable to studio heads to announce the general size, shape, and noise level of the product with which they plan to flood the market on a certain date; they’ll worry about what’s going to fill the box down the road.
TV looms large over this new movie lineup. How could it not? TV is everything. TV is how people see movies; TV is where people want to watch movies, on demand and on their own terms; TV is what Twitter wants to talk about. Most of all, TV knows how to keep people coming back, which is its job, every day and every week, and is a quality that, above all others, the people who finance movies would dearly love to poach.
But TV is also better at being TV than the movies are. Movies can do many things, but they can’t replicate that kind of sustained engagement well; they’re too big and lumbering and the pauses between them are too long. For years, Warner Bros. has been trying to make a movie, or maybe two, out of Stephen King’s long novel The Stand. Recently, its latest director, Josh Boone of The Fault in Our Stars, floated the idea that it could be four movies. That is not a movie; it’s a miniseries. In fact, I’m not clear on how it differs from the four-part, eight-hour ABC miniseries that was made of The Stand in 1994 except for the fact that it’s going to be “epic” (which is what we now tend to say when we mean some Peter Jacksonian combination of long, loud, and slow to climax) and will force us to wait a year between each installment rather than a night. That is not necessarily an improvement.
How do you import TV’s essential quality to the big screen without sacrificing the sense of immersive, self-contained completeness that for decades has been a central element of the movie experience? This summer, Disney announced The Avengers 3 and The Avengers 4. They will be called Infinity War Part 1 and Infinity War Part 2. Suddenly, the stakes for Avengers 2 (which is merely called Age of Ultron) feel lower. How could they not, when we know the freaking Infinity Wars are coming? Whatever arrives before that is just a chapter, an episode, an installment. Late next summer, after Age of Ultron, Marvel will release its next official Marvel Cinematic Universe movie, Ant-Man. At Marvel’s fan event this fall, Feige explained with affable precision that Ant-Man is an end–of–Phase 2 movie, not a beginning–of–Phase 3 movie. I’m sure he didn’t mean it like that, but the statement was, in its quiet way, one of the worst things I have ever heard an executive say about one of his own films. It felt like what was being said was, “Yeah, yeah, 2015 … BUT REALLY 2016!” In a culture based entirely on looking way, way ahead, next year, it seems, is already so one year ago.
I believe that what studios see when they look at the bumper-to-bumper barricade of a 2015–20 lineup they’ve built is a sense of security — a feeling that they have gotten their ducks in a row. But these lists, with their tremulous certainty that there is safety in numbers, especially when numbers come at the end of a title, represent something else as well: rigidity and fear. If you asked a bunch of executives without a creative bone in their bodies to craft a movie lineup for which the primary goal is to prevent failure, this is exactly what the defensive result would look like. It’s a bulwark that has been constructed using only those tools with which they feel comfortable — spreadsheets, P&L statements, demographic studies, risk-avoidance principles, and a calendar. There is no evident love of movies in this lineup, or even just joy in creative risk. Only a dread of losing.
At this point, optimists usually say lighten up, because, after all, good movies always find a way to get through. But here’s the thing: They don’t. The evidence that good movies survive is the fact that every year brings good movies, which is a bit like saying that climate change is a hoax because it’s nice out today. Yes, good movies sprout up, inevitably, in the cracks and seams between the tectonic plates on which all of these franchises stay balanced, and we are reassured of their hardiness. But we don’t see what we don’t see; we don’t see the effort, or the cost of the effort, or the movies of which we’re deprived because of the cost of the effort. Paul Thomas Anderson’s Inherent Vice may have come from a studio, but it still required a substantial chunk of outside financing, and at $35 million, it’s not even that expensive. No studio could find the $8.5 million it cost Dan Gilroy to make Nightcrawler. Birdman cost a mere $18 million and still had to scrape that together at the last minute. Imagine American movie culture for the last few years without Her or Foxcatcher or American Hustle or The Master or Zero Dark Thirty and it suddenly looks markedly more frail — and those movies exist only because of the fairy godmothership of independent producer Megan Ellison. The grace of billionaires is not a great business model on which to hang the hopes of an art form.
And that leaves aside the movies that aren’t getting made — the original scripts or adaptations of novels that make it into development and then go nowhere, with apologies and sighs from the people who bought them, unless a very big studio feels like doing a very big favor for a very big star, director, or producer. People who believe in the primacy of the marketplace will tell you that this winnowing process is a Darwinian, survival-of-the-fittest thing. It’s not, but even if it were, are we really supposed to cheer creative decisions that are based on nothing more than a nervous determination to avoid extinction? Yes, some good movies get through, but many that once would have now don’t, won’t, can’t. And a generation of midlevel executives that in the not-too-distant past would have been trained to develop and champion them now knows that doing so isn’t the way to move up in the ranks; these days, you make your bones by showing you can maximize the potential monetization of a preexisting brand or reawaken a dormant one. Stand-alone, non-repeatable hits are nice, but only in an outside-the-system way; they’re for people who don’t know how to think big.
In the Hollywood universe of the 1940s, movies could go from being green-lit to neighborhood theater in six months and be rewritten and tweaked to reflect the headlines just weeks before they opened. Many of those movies were intentionally disposable, and their creators would have guffawed at the notion that, 70 years later, we would still talk about and study and enjoy them. That world is long gone, and so is the world of the early 1970s, in which the hedonism and darkness and anxiety and madness of the moment seemed to ooze between the theater and the evening news and the morning headlines in a kind of real-time dialogue. Today, as obsessed as the studios are with the quick pivots of television and the hyper-responsiveness of social media, they don’t particularly want to make movies that reflect this moment; they understand that there’s less danger, and believe that there’s more profit, in making movies that retreat almost completely from any kind of temporal or cultural specificity that might feel short-lived or exclusionary.
Elasticity should be considered a virtue, right? The ability to swing with the times and not insist on viewing the changeable tastes of your target audience as something to be thwarted or bulldozed should make for a more vibrant and surprising menu. It always has. But that kind of cultural responsiveness is something the string-pullers of current movie culture actively seek to avoid. As with prepackaged food, exportability and shelf life are now primary virtues. The product Hollywood is selling right now keeps better if it contains as few organic ingredients as possible — whether organic to the place, the mood, the news, or the moment. Think of the major Hollywood studio movies you saw this year. Aside from their up-to-the-nanosecond technological razzle-dazzle, how many of them felt like they belonged specifically to 2014, as opposed to five, 10, or 15 years ago? Or, for that matter, five years from now?
So about that abyss. Let’s lay aside that metaphor, although I’m not convinced it’s the wrong one, and imagine that we can all forge a new world, even if it is, à la Interstellar, uncomfortably close to the edge of a black hole. But let’s at least acknowledge that we are witnessing the passing of something. You can weep or salute or shrug or refuse to look as it goes by, but it’s going by. Those long release calendars above may not constitute an obituary in themselves, but peer deeply enough into them, and you can see some indices of the cause of death. Of course, I’m aware that some of those movies — more than two, fewer than 20 — are likely to be really good. And that there will still be movies that excite those of us who tend to get excited by the kinds of movies that will never appear on lists like those. But consider how much of Hollywood’s collective effort and money and insistence and attention that roster is going to consume, and I think that if you love movies, you have to sigh a little. And if you care, you have to resist consoling yourself by claiming it was ever thus, because it wasn’t. The future of Hollywood movies right now — at least, as it lives in the hands of five-year planners — feels somehow small and cautious, a dream dreamed by people whose sugarplum visions of profit maximization depend on the belief that things will never change.
Is that your dream? Think of how old you’ll be in 2020. Where will you be in your life? What will be different? Do you imagine that your taste will be exactly what it is today? Hollywood profoundly hopes the answer is yes. Your sameness is what it prays for. Because whatever you choose to make of the end of the movie, and of the movie year, it’s Birdman’s world now. We just live in it, sitting quietly and waiting to be sold Birdman 4.