Take a seat, why dontcha!
On Prequels and Reboots: 2000-2010
INT. AN ATTIC - DAY.
Your HOST is going through stacks of old books, desperately looking for new material, while his friends talk about how they really wish there would be another Spider-Man origin story.
This was not a great decade for the industry. There were some huge successes, but overall, a failing economy in America (and around the globe) left audiences unwilling to go to the movies unless it was something they knew they’d like. Throw in the horrible tragedy of the 9/11 terrorist attacks and the war that followed, and you have people turning to superheroes and familiar action heroes the way Depression-era viewers flocked to frilly escapist musicals.
It was a decade of stagnation. Some directors continued to innovate, but the public retreated to an ever-shrinking bubble of what they were willing to go out and see. Throw in the rise of streaming TV/movie services near the end of the decade, echoing the competition TV gave movies in the ’50s and ’60s, and you’ve got an unstable, protean film landscape.
Nevertheless, you’ve got some fun stuff here.
CGI was still going strong, of course, and had advanced to the point where motion capture, a technique that translates physical performances into computer-generated ones, added greater realism to the film landscape. Robert Zemeckis and James Cameron, whom we’ve met before, brought this technology to its greatest heights, but it was Peter Jackson who brought the technique into the mainstream.
“Smile, Legolas. This is going on our Christmas cards.”
The Lord of the Rings trilogy has long been fantasy’s greatest and most treasured work, but previous film adaptations were either low-budget or animated. Peter Jackson’s sweeping film adaptation, which everyone expected to be a soulless remake in a sea of remakes, was a hit with audiences and critics alike. The Fellowship of the Ring (2001) combined as many practical elements as was, well, practical, and blended them with CGI environments and characters to bring Tolkien’s Middle-earth to life. Fellowship was the second-highest-grossing film of the year, behind Harry Potter and the Philosopher’s Stone (which I’ll get to in a bit), and was nominated for thirteen Oscars, winning four.
The next film, The Two Towers (2002), introduced the world to what motion capture technology was capable of when it brought the tortured character of Gollum (played by Andy Serkis) to life. Towers was nominated for six Oscars and won two.
Gollum is sad that he didn’t get an Oscar.
The final film, The Return of the King (2003), did the best of the bunch, winning all eleven Oscars for which it was nominated, including a Best Picture win, something no fantasy film had ever managed before. Return still holds the record for the biggest Oscar sweep.
Motion capture became the IT thing to do. Peter Jackson expanded on it for his 2005 adaptation of King Kong, wherein Andy Serkis donned the motion capture suit again and brought the titular giant ape to life. The film is technically impressive, featuring some jaw-dropping action scenes, but sadly a bland cast (except for Serkis, of course) kept the film from making as big a splash as it could have. It had a lot of potential, though. The scene with the T-Rexes is AMAZING.
Round 1: fight!
Robert Zemeckis, who’s always been a technical innovator, got in on the motion capture trend with three animated films. The Polar Express (2004) took computer-generated animation beyond what even Pixar had managed up to that point (and Pixar was doing great at the time) and brought more nuance to its characters by basing their movements directly on the physical performances of actors wearing performance capture rigs. Tom Hanks, the unquestionable star of the film, plays six distinct characters.
Zemeckis’ next film was an animated adaptation of Beowulf (2007) that used performance capture to create a gritty, brutal world that was absolutely NOTHING like the shiny gorgeousness of The Polar Express. I don’t think audiences and critics really got this one. It does take quite a bit of license with the original story, linking the dragon in the third part directly to the events with Grendel and his mother in the first two sections instead of leaving it as a separate challenge to overcome. I really REALLY like the visual style of this film, and the actors are all amazing, but the pacing is a bit slow and I always find myself falling asleep near the third act. One of these days I need to watch the whole thing, because it really features some of the best animation of the decade.
That’s not something you want to wake up to in the morning…
His last film of the decade returned to the feel of The Polar Express, which had made a bigger splash with audiences than Beowulf. A Christmas Carol (2009) again used motion capture to translate Jim Carrey’s multiple performances (he played Scrooge and all three of the Christmas ghosts) onto the screen. This one’s really good, but its faithfulness to the original story’s dark tone (especially the third ghost) made it a bit too scary for young children and some critics (but what do critics know, anyway?).
Jackson and Zemeckis both expanded the limits of animated characters by allowing actors to physically create their characters’ performances instead of just providing a voice in a recording booth. Disney had done a form of this in its early days: they would have actors in costume act out certain scenes, and the animators would copy those actions on the page. In some cases, they could even rotoscope animated drawings directly over physical performances to give them greater realism. But with motion capture, CGI characters became lifelike enough that the Academy of Motion Picture Arts and Sciences began fielding questions about whether acting awards could be given to actors who brought a character to life through the technique (I still think Andy Serkis deserves about twenty Oscars for Gollum).
But motion capture reached its peak with Avatar (2009). While the performances themselves were less stylistically distinctive than those in Lord of the Rings or The Polar Express, James Cameron’s ability to create nearly photorealistic CGI characters via performance capture is incredible. He began working on the project after Titanic, but the technology he needed didn’t exist yet; it took a decade or so for Cameron to invent and adapt the tools necessary to bring his vision to life. The work paid off. Avatar is still the highest-grossing film of all time. Its story is familiar and its plot is pretty slight, but it’s such a COOL world to muck about in that audiences flocked to it repeatedly. The film also heralded the resurgence of 3D. It had been creeping back a few years prior, but Avatar was a film that BEGGED to be seen in 3D.
“You mean, other movies have had the same plot as ours?!”
After Avatar, many films were up-converted to 3D with uneven success, and then a slew of older films were re-released in 3D to capitalize on the trend, again with mixed results. The Lion King and Jurassic Park were AMAZING in 3D, so I guess I’m happy for silly trends. 3D films are still around, but they’re not really the must-see thing anymore. They drive box office profits up, but I don’t think they’ll change theater-going until the technology becomes more practical.
Avatar was an oddball because it was an original story (ish; it was basically a Pocahontas/Dances with Wolves mashup), but the biggest hits of the decade were all remakes, sequels, or reboots.
The Star Wars prequels, of course, were a game-changing part of the decade, making huge technical leaps while sparking debates about whether those technical achievements were overshadowing the spirit of the franchise itself. Interestingly enough, Attack of the Clones (2002) was the first major blockbuster shot entirely on digital video instead of physical film. It’s funny that so much of the criticism leveled against the films was for an overuse of CGI, since Avatar would later prove that it wasn’t CGI itself that bothered people. What bothered them was the realization that cherished franchises, which were being dragged out of the woodwork throughout the decade, could never again have the specific aesthetic they’d had in the seventies and eighties. CGI-heavy films like King Kong could be big hits because their source material was distant enough that updated effects seemed necessary (after all, the original King Kong was brought to life with stop-motion puppets), but when it came to more recent franchises like Star Wars and Indiana Jones, the idea of mucking about with them, either telling more of the story or digging into backstory, made people really nervous.
Things changed in the second half of the decade when the “gritty reboot” came into fashion. Its origin can actually be traced back to the television reboot of Battlestar Galactica in 2004, which took a campy TV show from the seventies and turned it into a dark, brooding drama that got a lot of people hooked. The next year, the Batman franchise restarted with Christopher Nolan’s incredible Batman Begins, which abandoned all of the series’ campy and fantastical elements and delivered a grounded, relatable story, turning Batman from an idealized billionaire fighting crime into a champion for a city crushed by poverty and corruption.
“Don’t call it the Batmobile! That’s lame! It’s ‘The Tumbler’ now.”
Next up, the long-running James Bond franchise restarted from square one with Casino Royale (2006), a brutal action thriller that abandoned the gadgets and catchphrases of the series’ campier past and re-embraced the grittier realism of earlier entries like From Russia with Love. It was a big hit with fans and critics and introduced the world to Daniel Craig as a 007 learning that he can’t trust anyone except himself. The series drifted back into more fun territory in later years, but at this moment in history, this was the James Bond people wanted.
After that, Star Trek was rebooted on film in 2009 with a new roster of actors taking on the iconic roles from the series’ 1960s origins. Despite criticisms (many of them spot-on) that the film completely abandoned the spirit of Trek in favor of crowd-pleasing action, it brought the franchise back into the public eye for the first time since the cancellation of the prequel series Enterprise in 2005.
This was also the decade of the superhero. In the ’90s, superhero films fizzled out with Batman & Robin, but in 2000, X-Men, directed by Bryan Singer and starring Hugh Jackman in his first Hollywood role, proved that superhero films could be a thing again.
In 2002, Spider-Man, directed by Sam Raimi and starring Tobey Maguire and Kirsten Dunst, cemented that acceptance. Fans loved it, critics loved it (it was nominated for two Oscars in technical categories), and it set off a chain reaction that’s still unfolding today.
Strike a pose!
Ang Lee’s more cerebral, Jekyll-and-Hyde-style Hulk (2003) wasn’t as big a hit, but it did signal what was possible for superhero films. The Spider-Man franchise continued strong with Spider-Man 2 in 2004 (we won’t talk about Spider-Man 3) and gave studios the confidence to start another massive franchise, the Marvel Cinematic Universe, beginning with Iron Man in 2008. The Avengers concept was a big gamble, with multiple film franchises sharing a single continuity and building toward one larger story, but the gamble paid off. Marvel’s success led to the studio being bought by Disney, which took over distribution of its films.
It’s easy to get distracted by all the massive franchises that dominated this decade. Harry Potter, Star Wars, Batman, X-Men, Lord of the Rings, Spider-Man, and others ate up most of the media buzz, and it took a lot for other films to break through.
One that bears mentioning is Brokeback Mountain (2005), a film that wrestled its way to the front of the line by sheer controversy. Its central tragic love story would not have caused such a ruckus had it not been between two men, and stereotypically masculine fellas at that, which freaked everyone out. Directed by Ang Lee and starring Jake Gyllenhaal and Heath Ledger, the film was nominated for eight Oscars, winning three (though why it lost Best Picture to Crash is beyond me). It’s funny: even today, when the film comes up in conversation, it STILL evokes uncomfortable mutterings about “that gay cowboy movie,” even though gay marriage is now legal in the United States and LGBTQ visibility in media is increasing every year. This movie made America so violently uncomfortable that it will probably be another ten years before folks can openly appreciate what a good movie it is.
That horse doesn’t care that those two gents love each other. Be like that horse.
Another big hit of this decade was Gladiator (2000), Ridley Scott’s Roman epic starring Russell Crowe. Big-budget historical epics didn’t generate quite the same buzz as they had in the ’90s, but this one snagged a Best Picture win as well as four other Oscars. The Big Historical Epic would fall out of fashion afterward, but Russell Crowe enjoyed a run of amazing success, earning another Oscar nomination the following year for A Beautiful Mind (2001).
There’s a LOT more I could talk about, but I HAVE to mention Lee Daniels’ incredible run during this decade: producing the Oscar-winning Monster’s Ball and directing the heart-rendingly powerful Precious (2009), which earned six Oscar nominations. Incredibly, Precious was the first Best Picture nominee directed by an African-American filmmaker. Geoffrey Fletcher’s script won Best Adapted Screenplay (making him the first African-American to win in that category), and actress/comedian Mo’Nique won Best Supporting Actress for her terrifying portrayal of an abusive mother, receiving a standing ovation when she stepped up to accept the award.
This movie is pretty intense, but worth watching.
And that brings us to the end of Cinema. Looking to the future, the current decade is basically repeating everything the 2000s did. I hope something comes along soon that shakes everything up. If this blog is still going in 2020, I’ll definitely do another episode.
You are all amazing.