By ALEX FUNG
February 26th, 2001

As in previous years, I have collected several Top 10 lists from various film critics with enormous help from many generous people, and have attempted to consolidate this data into a 'consensus' ordered list of the top films of the year.
PREVIOUS RESULTS: 1999 (American Beauty) 1998 (Saving Private Ryan) 1997 (L.A. Confidential) 1996 (Fargo) 1995 (Sense And Sensibility)

Anti-Top 10 Rant
In recent years, several film critics have penned columns railing against the proliferation of year-end top 10 lists, charging that such listmaking is an intrinsically lazy endeavour that's been effectively co-opted by studio publicists as a marketing tool; there's certainly much truth in these sentiments. Top 10 lists have become a notable hype device in selling films to the public -- consider the number of trailers or TV spots which boast that Film X has been named by Critics Y and Z as "One of the best films of the year!" or has been "Featured on 50 top 10 lists!" -- and, due to impeccable timing, a vital marketing tool for Academy Award campaigns; where possible, most For Your Consideration trade advertisements are quick to point out that a promoted film has been listed on multiple top 10 lists. Successfully campaigning for pictures named to Top 10 lists has attained such importance that it's a veritable cottage industry -- come November and December, critics are inundated with packages of screener tapes and DVDs of movies for possible top 10 list consideration, and field calls from insistent publicists petitioning them to include Film X in their year-end list. In short, top 10 listmaking has evolved from being a fun little exercise into an act which can have detectable ramifications on a film's performance, be it in the marketplace or with the Oscars, and consequently publicists now attempt to influence the process as much as possible. (As is their right; they can certainly ask, but it doesn't mean that critics have to acquiesce.
Of course, some are more than willing to play ball, as in the case of a publicity-desperate group that was positively overjoyed at the opportunity to be mentioned in a studio FYC ad in exchange for top 10 cites; other critics can be decidedly less agreeable at the thought of being complicit with publicists -- the Chicago Reader's Jonathan Rosenbaum once blasted the Chicago Film Critics Association for cooperating with publicist requests, while the Boston Phoenix's Gerald Peary wrote in late 2000, "The last time I looked, I was a film critic, which means I'm not supposed to be a PR recruit for any official movie publicity.")
Top 10 lists are arguably the antithesis of the entire concept of film criticism; there's no analysis involved, no insights provided -- it's simply a list of ordered (and sometimes even unordered) film titles, with the implication being that said pictures are a given critic's favourites of the year. Top 10 listmaking is on a par with grading movies with stars or a letter grade; it's nothing more than a shorthand (no wonder some call it laziness) and in no way serves as an adequate substitute for actual commentary on the films. It's difficult to be terribly mentally stimulated by a statement that Traffic is a critic's #3 film of the year, whereas an intelligent column dissecting the picture's themes and contextualizing the film's place within its genre can be infinitely more rewarding and meaningful.
So why do it? Why is there such a proliferation of top 10 lists? There are numerous reasons; I'll fire off three:
- As mentioned above, top 10 lists are a quick and easy shorthand -- an implicit approval and an implicit recommendation for the films listed. With reader attention spans shortening (note the modern proliferation of acronyms) and time and column space at a premium, it's no wonder that many publications favour the quick-and-dirty top 10 list format in favour of involved commentary. (If you're actually reading this and haven't already skipped down to the overall list yet, you're in the minority.)
- It's fun. Putting aside all the ramifications and implications inherent in selecting one film for a list in favour of another, it can be an enjoyable little exercise to reflect on the past year and sift through notable achievements. It's also fun to discover favourites amongst friends and colleagues; as Fort Worth Star-Telegram critic Christopher Kelly wrote (in a column which otherwise lambasted the top 10 process), "It's top-10-list season -- and I can't wait to find out which films my movie critic colleagues are going to cite as the year's best. Will I agree with them? Will I think they've lost their marbles? Will I learn of a movie or two that might have slipped beneath my radar?"
- Man is a creature with an intrinsic compulsion to seek order from chaos. To quantify a film, be it through a star rating, a letter grade, or a year-end ranking, is a symptom of this. The same holds true with other art forms which are equally inherently unquantifiable -- songs, records, books, and so forth.
As for the value of this particular exercise -- that is, attempting to consolidate several hundred top 10 lists into one so-called "consensus" list -- I'm not entirely certain. It's interesting to look at as a freakshow sort of stat, and I'm reasonably confident that there's a pretty good correlation between the consensus list and the critical reception received by the films, but I couldn't proclaim in good faith that this was a definitive list of The Best Films Of The Year -- merely the most critically-acclaimed. Nevertheless, I'm pleased that many seem to find this "Critical Consensus" exercise interesting and useful; I'm not entirely certain what the folks on the New York Times discussion board are using these results for, but the vast majority of feedback I've received pertaining to this endeavour indicates that it's being frequently used as a checklist for movies to see or rent. Keeping in mind that this isn't a definitive list -- do not treat it as holy writ -- I'm pleased that some may be experiencing films they wouldn't otherwise have seen as a result of this exercise, and hope that the list might serve as a springboard; if nothing else, I'd consider it to be a more useful viewer guide to discover worthwhile new cinema than the oft-published box office charts.
Scoring
I have again employed my proprietary scoring methodology this year, with a weighted scale favouring higher-ranked films on ordered Top 10 lists, and equal weighting for lists where no particular order was used. (For hybrid lists, I have compensated accordingly.) In ordered lists, only the first 10 films were counted -- if a critic deigned to list fifty pictures, I only took note of the first ten. "Tied" films were ranked equally in both ordered and unordered lists.
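The scoring scheme itself is proprietary, but the general mechanics described above can be sketched as follows. The descending weights and the flat weight below are placeholders I've invented for illustration, not the actual values used in this survey:

```python
# Hypothetical sketch of the consensus-scoring mechanics described above.
# ORDERED_WEIGHTS and UNORDERED_WEIGHT are invented placeholder values,
# not the proprietary figures actually used for this survey.

ORDERED_WEIGHTS = [20, 18, 16, 14, 12, 10, 8, 6, 4, 2]  # assumed, ranks 1..10
UNORDERED_WEIGHT = 11  # assumed flat weight for unranked lists

def score_list(films, ordered=True):
    """Return {film: points} for one critic's list.

    `films` is a sequence of entries; an entry may itself be a list of
    titles representing a tie (tied films share the same rank's weight).
    Only the first 10 ranked slots of an ordered list are counted.
    """
    points = {}
    for rank, entry in enumerate(films):
        if rank >= 10:
            break  # ignore anything past the tenth ranked slot
        group = entry if isinstance(entry, list) else [entry]
        weight = ORDERED_WEIGHTS[rank] if ordered else UNORDERED_WEIGHT
        for title in group:
            points[title] = points.get(title, 0) + weight
    return points

def consolidate(all_lists):
    """Sum per-critic points into one consensus tally, sorted by points.

    `all_lists` is a sequence of (films, ordered) pairs, one per critic.
    """
    totals = {}
    for films, ordered in all_lists:
        for title, pts in score_list(films, ordered).items():
            totals[title] = totals.get(title, 0) + pts
    return sorted(totals.items(), key=lambda kv: -kv[1])
```

With a handful of toy lists, `consolidate` produces exactly the kind of ranked (title, points) tally the results below are drawn from.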
In instances where critics chose to recognize only a specific portion or sequence of a picture, I appropriately downgraded the points allotted to the film accordingly under the reasoning that it made no sense to assign the full point total when the critic only deemed a section of the picture to be 'listworthy'.
Several film critics filed multiple top ten lists in various publications, surveys and polls, many of which contained a different slate of films or different ordering thereof. In such conflicting instances, I have used the list derived from their affiliated periodical or publication.
As per last year, for the purposes of this compilation I have excluded the lists from all non-North American sources on the grounds that films which have been released internationally will clearly have an unfair advantage over pictures which have yet to bow in foreign territories. (Films yet to be released internationally will obviously have little-to-no chance to appear on any foreign Top 10 lists.)
This is not a perfect -- or even optimal -- system of measuring overall critical preferences by any stretch of the imagination; there are many glaring weaknesses with the methodology of extrapolating general critical favourites from top 10 lists.
Films that did not receive wide theatrical releases prior to the end of the year (which includes late-year platform releases, limited release pictures and quite literally all foreign-language films) are handicapped under this system as film critics in smaller markets will be unable to consider the pictures for their Top 10 lists either for practical -- they haven't seen them -- or ideological -- they won't list films unavailable to their readership -- grounds. (Some of the former can be offset by advance press screenings and the circulation of screener videos.) Simply put, virtually every critic has had the opportunity to see Gladiator and can consider whether or not to put it on their top 10 list; a decidedly smaller number could say the same about, say, Beau Travail or The Wind Will Carry Us or even something like The House Of Mirth. This should be considered when comparing the rankings of a huge wide-release studio picture like Erin Brockovich with a limited release arthouse picture which few have had the chance to see like Ratcatcher.
There's also the matter of the different sorts of film critics being surveyed -- while some film critics (mostly those based with smaller publications or with TV programmes, as well as some affiliated with large, very mainstream periodicals) stick to accessible Hollywood fare to populate their Top 10 lists (due either to editorial and/or readership constraints, or overall apathy towards cinema -- I continue to be aghast at reports of so-called critics who shun press screenings for arthouse films but pack-em-in for the latest Universal or Paramount or Fox offering), other film writers (frequently contributors to alternative publications) are willing (and able) to champion more adventurous fare -- small independent and experimental films, obscure foreign-language pictures, etc. This schism results in a rather schizophrenic final consolidated list which strikes me as both rather unsatisfying and unavoidable under this technique.
I'm also growing increasingly wary about my resolutely democratic weighting of each top 10 list. Each critic's list is presently weighed equally, but I'm unsure whether this is indeed appropriate; it's hard to dispute that some critics hold more influence than others, that some are better informed than others, and so on -- with all due respect, to give the same weight to a list from a prominent national film critic as to a reviewer for a local newspaper seems rather questionable. (And, of course, some critics are hardly "critics" at all.) One could make the argument that certain critics' lists should be given additional weight to take into account various factors, but the implementation of such a scheme would be tremendously problematic. (If, say, Owen Gleiberman is given a weighting of 3, then what's Lisa Schwarzbaum? A 3? 1.85? 4.53?)
My system is completely unbalanced as it measures the positive (that is, critics who are sufficiently passionate about a film to put it on their Top 10 lists) without taking into consideration the negative (critics who are either ambivalent or hostile towards the picture). Consequently, a film with a small group of ardent supporters will fare more successfully in this system than one which receives tepid appreciation; extremely divisive films will also score highly, as the degree of loathing directed at it by some critics is discarded altogether.
Ideally, rather than using Top 10 lists, the preferable methodology would be to have all critics grade films on a uniform assessment system -- the 4-star rating system, or the out-of-10 technique -- and then compute the mean for each movie. This would then lessen the impact of bloc support and take into account the sentiments of critics indifferent (or worse) towards various pictures. However, even this methodology has significant handicaps. The same star rating can be interpreted differently by film critics; one critic's "***" assessment isn't necessarily equivalent to another's. Jonathan Rosenbaum, the fine critic of the Chicago Reader, equates a three star rating with a verbal equivalent of "a must-see"; for Roger Ebert, the well-known film critic of the Chicago Sun-Times, three stars approaches his average rating. Rosenbaum deems a two star-rated film "worth seeing"; for other critics, such a rating would be considered an out-and-out pan. It's quite clear that different critics utilising the same basic rating system can generate vastly different results based on their own tendencies.
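The mean-grade alternative is simple enough to sketch; the critic names and star ratings below are made up for illustration, not real data:

```python
# Sketch of the alternative methodology described above: average each
# film's star ratings across all critics who graded it, instead of
# tallying Top 10 placements. All names and grades here are invented.

def mean_ratings(ratings):
    """ratings: {critic: {film: stars}} -> {film: mean stars}."""
    sums, counts = {}, {}
    for grades in ratings.values():
        for film, stars in grades.items():
            sums[film] = sums.get(film, 0.0) + stars
            counts[film] = counts.get(film, 0) + 1
    return {film: sums[film] / counts[film] for film in sums}

ratings = {
    "critic_a": {"Traffic": 3.5, "Gladiator": 2.0},
    "critic_b": {"Traffic": 4.0, "Gladiator": 3.0},
}
```

Note how this captures the lukewarm grades that the Top 10 method discards entirely: a film every critic rates ** scores lower than one with a few **** raves and many shrugs would under the list system.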
This holds true even if critics are provided with guidelines regarding star rating equivalence so that some degree of uniformity can be achieved. I participate in an annual survey which in part solicits participants to assess every commercial release on the Maltin four-star scale and explicitly provides a sentiment-to-grade conversion scale ("1.0 = Utter crap"); nevertheless, there are distinct variations in voter tendencies. Given that almost every critic's grades fall into a Gaussian distribution -- and they do -- I developed a "Toughness" statistic which accounts for undue influences (i.e. a participant gives a disproportionate number of high star ratings simply because they exclusively see good movies) and effectively determines the delta between a voter's Gaussian peak and that of the norm. I was surprised to find that despite the provision of an explicit conversion scale, the overall spread between the stingiest and most generous voter exceeded a full star rating; that is, one man's "**" quite literally equals another's "***". Such significant variances would diminish the overall accuracy of using an average star rating system to rank overall critical assessments; still, it'd be a preferable method over making use of year-end Top 10 lists.
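The core of the "Toughness" idea can be roughly sketched like so. I'm substituting a simple mean-of-means for the Gaussian-peak fitting described above, and the voter names and grades are invented:

```python
# Rough sketch of the "Toughness" statistic: measure how far each
# voter's typical grade sits from the pool-wide norm. A negative
# offset marks a stingy grader, a positive one a generous grader.
# (The actual statistic fits Gaussian peaks; a plain mean stands in
# for the peak here. All voter data below is invented.)

def toughness(ratings):
    """ratings: {voter: [stars, ...]} -> {voter: offset from pool mean}."""
    voter_means = {v: sum(g) / len(g) for v, g in ratings.items()}
    pool_mean = sum(voter_means.values()) / len(voter_means)
    return {v: m - pool_mean for v, m in voter_means.items()}
```

A spread of 1.5 between the extremes of such a table is exactly the "one man's ** equals another's ***" situation: the offsets could in principle be subtracted back out of each voter's grades to normalise them before averaging.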
Nevertheless, this method is not without some merit; one would be hard-pressed to claim that Crouching Tiger, Hidden Dragon, You Can Count On Me and Traffic were not largely well-received by the North American critical community. It's nearly irrefutable that the films near the top of the list received much critical acclaim, the films in the middle of the list less so, and so forth; objectively, there's a good correlation here. Whether the film ranking 27th was better received by critics than the movie ranking 28th (or 29th, or even 30th) is obviously debatable, but I presume it's understood that this degree of precision is unattainable: while the 1.409 point lead Before Night Falls holds over The Wind Will Carry Us is hardly definitive, I can state with a reasonable degree of confidence that the #15 film wasn't generally as well-received by North American film critics at large as the #5 picture, but was better liked than the #25 and #35 movies.
As per the last couple of years, I've included "point totals" along with the rankings. While the figures themselves are largely incomprehensible, they should provide a vague idea of how films fared relative to each other for comparative purposes. The figure in square brackets next to the film title, where applicable, is the number of first-place votes the picture received.
The results for 2000:
[1] Crouching Tiger, Hidden Dragon [40½] (Ang Lee) 1835.677
[2] You Can Count On Me [11] (Kenneth Lonergan) 1215.741
[3] Traffic [15] (Steven Soderbergh) 1042.018
[4] Almost Famous [10½] (Cameron Crowe) 1001.371
[5] Wonder Boys [4] (Curtis Hanson) 767.712
[6] Dancer In The Dark [7] (Lars von Trier) 763.134
[7] Yi Yi (A One And A Two...) [14] (Edward Yang) 675.883
[8] Erin Brockovich [1] (Steven Soderbergh) 657.856
[9] Requiem For A Dream [3] (Darren Aronofsky) 627.036
[10] Chicken Run [2] (Peter Lord and Nick Park) 563.763
[11] Gladiator [2] (Ridley Scott) 550.400
[12] Beau Travail [11] (Claire Denis) 523.942
[13] Croupier [4] (Mike Hodges) 494.800
[14] High Fidelity (Stephen Frears) 483.053
[15] O Brother, Where Art Thou? [3½] (Joel Coen) 451.600
[16] The House Of Mirth [3] (Terence Davies) 446.700
[17] Billy Elliot [1] (Stephen Daldry) 375.912
[18] Nurse Betty [1] (Neil LaBute) 367.709
[19] Before Night Falls [3] (Julian Schnabel) 353.509
[20] The Wind Will Carry Us [4] (Abbas Kiarostami) 352.100
[21] The Virgin Suicides (Sofia Coppola) 320.909
[22] Quills [1] (Philip Kaufman) 313.766
[23] Cast Away [1] (Robert Zemeckis) 306.500
[24] The Color Of Paradise [3] (Majid Majidi) 268.944
[25] L'Humanité [1] (Bruno Dumont) 260.100
[26] Best In Show (Christopher Guest) 256.323
[27] Ghost Dog: The Way Of The Samurai (Jim Jarmusch) 245.478
[28] George Washington [2] (David Gordon Green) 243.342
[29] In The Mood For Love [1] (Wong Kar-wai) 242.300
[30] Jesus' Son (Alison Maclean) 233.851
[31] Hamlet [4] (Michael Almereyda) 215.600
[32] Chuck & Buck [1] (Miguel Arteta) 214.704
[33] Ratcatcher [1] (Lynne Ramsay) 202.200
[34] A Time For Drunken Horses [1] (Bahman Ghobadi) 171.709
[35] Not One Less (Zhang Yimou) 170.615
[36] Shadow Of The Vampire [1] (E. Elias Merhige) 166.700
[37] State And Main [1] (David Mamet) 162.900
[38] Girlfight [1] (Karyn Kusama) 161.695
[39] Girl On The Bridge [1] (Patrice Leconte) 159.044
[40] Thirteen Days (Roger Donaldson) 156.800
[41] Unbreakable [1] (M. Night Shyamalan) 154.767
[42] The Filth And The Fury (Julien Temple) 153.750
[43] Time Regained [1] (Raoul Ruiz) 148.320
[44] Bamboozled (Spike Lee) 141.251
[45] Time Code [1] (Mike Figgis) 125.900
[46] Wonderland [1] (Michael Winterbottom) 122.500
[47] Human Resources (Laurent Cantet) 122.342
[48] Pola X [1] (Leos Carax) 119.400
[49] Sunshine (István Szabó) 117.800
[50] After Life [1] (Hirokazu Kore-eda) 114.953

List generated with data as of 01/01/28.

Commentary
- The race for the #1 position in this year's survey wasn't even close, with Ang Lee's Crouching Tiger, Hidden Dragon easily outpacing Kenneth Lonergan's You Can Count On Me for the crown. Crouching Tiger ranks first by any standard imaginable -- it scored the most "points" in my proprietary scoring system, placed on more Top 10 lists than any other picture by a wide margin, and, with forty-and-a-half first place votes, ranked tops with more North American-based critics than the next three films combined: in short, it was a complete blow-out. There's little doubt in my mind that Crouching Tiger, Hidden Dragon can be indisputably deemed The Most Critically-Acclaimed Film Of The Year.
(After some consideration, its success in the study isn't overly surprising; as a foreign-lingo arthouse-smash-turned-mainstream-phenomenon, it follows that the film could capture the fancy of both arty and mainstream film critics alike, and hence draw on support from both groups.) Crouching Tiger's #1 ranking enables director Ang Lee to be the first filmmaker in the history of this study to claim two #1 films -- his genteel Sense And Sensibility topped the inaugural (and admittedly-crude) annual list in 1995. (Lee's The Ice Storm placed fifth in 1997; he's the only director to score three top five appearances.)
- Crouching Tiger, Hidden Dragon is the first foreign-language film to place first in the annual rankings. The previous highest-ranking foreign-language film was Roberto Benigni's Life Is Beautiful, which ranked 7th in 1998.
- The race for the runner-up position was much more interesting, with Lonergan's directorial debut eventually beating out Steven Soderbergh's drug war drama Traffic and Cameron Crowe's wistful Almost Famous for the #2 position. There was a fairly substantial gap between Famous and the fifth-ranking film, Curtis Hanson's Wonder Boys.
- The strong performance of Edward Yang's drama Yi Yi (A One And A Two...) (#7) was quite remarkable given the film had only played a few select markets by the end of 2000. Winner of the year's National Society of Film Critics award for Best Picture, Yi Yi topped fourteen film critics' lists and conceivably could've given You Can Count On Me or even Crouching Tiger, Hidden Dragon a run for their money had it been seen by a wider base of film critics by year's end; even with its very limited theatrical release, it nearly managed to crack the top 5. This was the first year that two foreign-language films ranked in the top 10.
- The highly-divisive Lars von Trier picture Dancer In The Dark (#6) was a clear beneficiary of the methodology's bias of only measuring positive assessments while neglecting the negative; while the film has numerous supporters, it also has a substantial number of vocal detractors. I'm quite certain that a system which took into account the film's critics would find Dancer In The Dark placing significantly lower in the year's rankings.
- Only three films among the top 10-ranked pictures were theatrically released in the first half of the year -- Curtis Hanson's Wonder Boys (#4), Steven Soderbergh's Erin Brockovich (#8), and Peter Lord and Nick Park's Chicken Run (#10). Again, this gives credence either to the cliché that film critics, like Academy members, have short memories, or that studios release their stronger pictures in the second half of the year. I continue to subscribe to the latter philosophy.
- In addition to the strong performances by Crouching Tiger, Hidden Dragon and Yi Yi (A One And A Two...) in this year's survey, other films which ranked highly despite being rolled out in very limited or exclusive runs in December include Claire Denis' Beau Travail (#12, with eleven first-place rankings), the Coens' O Brother, Where Art Thou? (#15), Terence Davies' The House Of Mirth (#16), Julian Schnabel's Before Night Falls (#19), and Iranian master Abbas Kiarostami's The Wind Will Carry Us (#20). Do note that the availability of screening cassettes and DVDs can and occasionally does offset the impediment of limited releases in terms of giving film critics access to the film for the purposes of top 10 lists.
- I was somewhat surprised by Cast Away's (#23) soft showing in this year's survey -- for a year-end Oscar-bait prestige picture with Tom Hanks and Robert Zemeckis, one would tend to expect stronger support than that which it received.
- Although the perception exists that film critics tend to praise uncommercial and inaccessible material, two of the top 10 critical favourites -- Erin Brockovich (#8) and Chicken Run (#10) -- grossed over $100 million domestic, and as of this writing both Crouching Tiger, Hidden Dragon (#1) and Traffic (#3) seem likely to crack the century mark, bringing the total up to four. However, only two of 2000's top 10 domestic box office grossers are represented on this year's Critical Consensus list -- Gladiator (#11) and Cast Away (#23).
- The films which ranked highest without topping any single critic's Top 10 list were Stephen Frears' High Fidelity (#14), Sofia Coppola's The Virgin Suicides (#21), and Christopher Guest's latest mockumentary Best In Show (#26).
- Veteran filmmakers dominated this year's survey, particularly at the top of the list: Ang Lee, Steven Soderbergh, Cameron Crowe, Curtis Hanson, and Lars von Trier have placed in previous surveys. The top first-time filmmakers this year included Kenneth Lonergan (You Can Count On Me [#2]), Stephen Daldry (Billy Elliot [#17]), and Sofia Coppola (The Virgin Suicides [#21]).
- XX count (female directors): 5. (Denis, Coppola, Maclean, Ramsay and Kusama)
- To absolutely nobody's surprise, Steven Soderbergh managed to place two films in the top fifty -- Traffic and Erin Brockovich. He's the first filmmaker to do so since Takeshi Kitano in 1998.
- After the animation renaissance of 1999, things returned to normal in 2000. Only Chicken Run (#10) managed to crack the top fifty. (The next closest was Dinosaur, way down at #127.)
- After a strong 1999, only one documentary was represented on the final list -- Julien Temple's Sex Pistols picture, The Filth And The Fury (#42). This is the worst showing for documentaries in this study since 1998, when S.R. Bindler's Hands On A Hardbody was the sole documentary to make the list (1998 #48).
- Fourteen of the top fifty critical favourites played theaters entirely in a language other than English. (This does not include Ghost Dog: The Way Of The Samurai, which features some French dialogue.) The top ranking foreign-language films were Crouching Tiger, Hidden Dragon (#1), Yi Yi (A One And A Two...) (#7), Beau Travail (#12), The Wind Will Carry Us (#20), and The Color Of Paradise (#24). Five of the fourteen pictures were of Asian origin; six were primarily in French.
- Due to its platform release, Claire Denis' Beau Travail (#12) also appeared on the 1999 list (#49 last year). Similarly, Hirokazu Kore-eda's After Life (a 2000 Canadian theatrical release) appeared on both lists (#50 in 2000, #30 in 1999). (God help us with the Amy Heckerling remake.)
- As in the case of After Life, Gary Burns' waydowntown almost made the Y2K Top 50 based on support from Canadian critics alone -- it wound up at #59. (Its U.S. theatrical run begins in 2001.)
- Wong Kar-wai's Tony Leung/Maggie Cheung vehicle In The Mood For Love (#29) placed amongst the top fifty critical faves exclusively due to support from critics who saw the film at a festival venue; it did not go into domestic theatrical distribution until 2001.
- There's a nice synchronicity about Shadow Of The Vampire and State And Main, two films preoccupied with the filmmaking process, ranking consecutively at #36 and #37, respectively.
- #51-55, for those who care: Judy Berlin, Magnolia (a 1999 platform release), Dark Days, An Affair Of Love, and The Claim.
- Curiously, Battlefield: Earth did not make any surveyed film critic's Top 10 list. (Has anyone else been getting a kick out of the hilarious Dateline Communications press releases in support of the film's DVD release?)
- Heading into 2000, the lowest-ranking Best Picture nominee over the duration of this study had been Good Will Hunting (1997 #24). However, the 1999 films which made up the Best Picture Oscar nominees were American Beauty (1999 #1), The Insider (#5), The Sixth Sense (#20), The Green Mile (#31), and The Cider House Rules (#38), allowing the Lasse Hallström picture to capture the dubious honor of lowest-ranking Best Picture nominee.
This mark was eclipsed again this year, as the 2000 Best Picture nominees were Crouching Tiger Hidden Dragon (2000 #1), Traffic (#3), Erin Brockovich (#8), Gladiator (#11), and Chocolat (unranked; #64). Curiously, the past three films to have held the record for lowest-ranking Best Picture nominees -- Chocolat, The Cider House Rules, and Good Will Hunting -- are all Miramax pictures.
- All six films to have ranked #1 over the duration of this annual survey have received Best Picture Oscar nominations, but only one of the five (this year's Oscars are outstanding as of this writing) has actually gone on to win the award (American Beauty, 1999). At least two of the top five films have gone on to receive Best Picture Oscar nominations in every year of this study.
Alex Fung (aw220@freenet.carleton.ca).
Back to film page.