
2000's Top Films: A Critical Consensus

By ALEX FUNG
February 26th, 2001

PREVIOUS RESULTS:
  • 1999 (American Beauty)
  • 1998 (Saving Private Ryan)
  • 1997 (L.A. Confidential)
  • 1996 (Fargo)
  • 1995 (Sense And Sensibility)

    As in previous years, I have collected several Top 10 lists from various film critics with enormous help from many generous people, and have attempted to consolidate this data into a 'consensus' ordered list of the top films of the year.

    Anti-Top 10 Rant

    In recent years, several film critics have penned columns railing against the proliferation of year-end top 10 lists, charging that such listmaking is an intrinsically lazy endeavour that's been effectively co-opted by studio publicists as a marketing tool; there's certainly much truth in these sentiments. Top 10 lists have become a notable hype device in selling films to the public -- consider the number of trailers or TV spots which boast that Film X has been named by Critics Y and Z as "One of the best films of the year!" or has been "Featured on 50 top 10 lists!" -- and, due to impeccable timing, a vital marketing tool for Academy Award campaigns; where possible, most For Your Consideration trade advertisements are quick to point out that a promoted film has been listed on multiple top 10 lists. Campaigning to have pictures named to Top 10 lists has attained such importance that it's become a veritable cottage industry -- come November and December, critics are inundated with packages of screener tapes and DVDs of movies for possible top 10 list consideration, and field calls from insistent publicists petitioning them to include Film X in their year-end lists. In short, top 10 listmaking has evolved from being a fun little exercise into an act which can have detectable ramifications on a film's performance, be it in the marketplace or with the Oscars, and consequently publicists now attempt to influence the process as much as possible. (As is their right; they can certainly ask, but it doesn't mean that critics have to acquiesce. Of course, some are more than willing to play ball, as in the case of a publicity-desperate group that was positively overjoyed at the opportunity to be mentioned in a studio FYC ad in exchange for top 10 cites; other critics can be decidedly less agreeable at the thought of being complicit with publicists -- the Chicago Reader's Jonathan Rosenbaum once blasted the Chicago Film Critics Association for cooperating with publicist requests, while the Boston Phoenix's Gerald Peary wrote in late 2000, "The last time I looked, I was a film critic, which means I'm not supposed to be a PR recruit for any official movie publicity.")

    Top 10 lists are arguably the antithesis of the entire concept of film criticism; there's no analysis involved, no insights provided -- it's simply a list of ordered (and sometimes even unordered) film titles, with the implication being that said pictures are a given critic's favourites of the year. Top 10 listmaking is on a par with grading movies with stars or a letter grade; it's nothing more than a shorthand (no wonder some call it laziness) and in no way serves as an adequate substitute for actual commentary on the films. It's difficult to be terribly mentally stimulated by a statement that Traffic is a critic's #3 film of the year, whereas an intelligent column dissecting the picture's themes and contextualizing the film's place within its genre can be infinitely more rewarding and meaningful.

    So why do it? Why is there such a proliferation of top 10 lists? There are numerous reasons; I'll fire off three:

    1. As mentioned above, top 10 lists are a quick and easy shorthand -- an implicit approval and an implicit recommendation for the films listed. With reader attention spans shortening (note the modern proliferation of acronyms) and time and column space at a premium, it's no wonder that many publications favour the quick-and-dirty top 10 list format over involved commentary. (If you're actually reading this and haven't already skipped down to the overall list yet, you're in the minority.)

    2. It's fun. Putting aside all the ramifications and implications inherent in selecting one film for a list in favour of another, it can be an enjoyable little exercise to reflect on the past year and sift through notable achievements. It's also fun to discover favourites amongst friends and colleagues; as Fort Worth Star-Telegram critic Christopher Kelly wrote (in a column which otherwise lambasted the top 10 process), "It's top-10-list season -- and I can't wait to find out which films my movie critic colleagues are going to cite as the year's best. Will I agree with them? Will I think they've lost their marbles? Will I learn of a movie or two that might have slipped beneath my radar?"

    3. Man is a creature with an intrinsic compulsion to seek order from chaos. To quantify a film, be it through a star rating, a letter grade, or a year-end ranking, is a symptom of this. The same holds true with other art forms which are equally inherently unquantifiable -- songs, records, books, and so forth.

    As for the value of this particular exercise -- that is, attempting to consolidate several hundred top 10 lists into one so-called "consensus" list -- I'm not entirely certain. It's interesting to look at as a freakshow sort of stat, and I'm reasonably confident that there's a pretty good correlation between the consensus list and the critical reception received by the films, but I couldn't proclaim in good faith that this is a definitive list of The Best Films Of The Year -- merely the most critically-acclaimed. Nevertheless, I'm pleased that many seem to find this "Critical Consensus" exercise interesting and useful; I'm not entirely certain what the folks on the New York Times discussion board are using these results for, but the vast majority of feedback I've received pertaining to this endeavour indicates that it's frequently used as a checklist for movies to see or rent. Keeping in mind that this isn't a definitive list -- do not treat it as holy writ -- I'm pleased that some may be experiencing films they wouldn't otherwise have seen as a result of this exercise, and hope that the list might serve as a springboard; if nothing else, I'd consider it to be a more useful viewer guide to discovering worthwhile new cinema than the oft-published box office charts.

    Scoring

    I have again employed my proprietary scoring methodology this year, with a weighted scale for ordered Top 10 lists and equal weighting for lists where no particular order was used. (For hybrid lists, I have compensated accordingly.) In ordered lists, only the first 10 films were counted -- if a critic opted to list fifty pictures, I only took note of the first ten. "Tied" films were ranked equally in both ordered and unordered lists.

    In instances where critics chose to recognize only a specific portion or sequence of a picture, I downgraded the points allotted to the film, on the reasoning that it made no sense to assign the full point total when the critic only deemed a section of the picture to be 'listworthy'.
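
    For the technically inclined, a rough Python sketch of such a tally appears below. The point values (ten down to one for ordered lists, a flat credit for unordered ones) and the fractional credit for partial citations are illustrative assumptions only, not the actual proprietary scale, and ties are left out for simplicity.

        # Rough sketch of the consensus tally. The point values and the
        # fractional credit for partial citations are illustrative assumptions,
        # not the actual proprietary weighting scale.
        from collections import defaultdict

        def tally(critic_lists):
            """critic_lists: iterable of (entries, is_ordered) pairs, where each
            entry is (title, credit); credit is 1.0 normally, or a fraction when
            the critic singled out only a portion of the picture."""
            totals = defaultdict(float)
            for entries, is_ordered in critic_lists:
                for rank, (title, credit) in enumerate(entries[:10], start=1):  # first ten picks only
                    base = (11 - rank) if is_ordered else 5.5                   # flat credit if unordered
                    totals[title] += base * credit
            return sorted(totals.items(), key=lambda kv: -kv[1])

        # Two toy lists: an ordered one, and an unordered one with a partial citation.
        print(tally([
            ([("Traffic", 1.0), ("Wonder Boys", 1.0)], True),
            ([("You Can Count On Me", 1.0), ("Hamlet", 0.5)], False),
        ]))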

    Several film critics filed multiple top ten lists in various publications, surveys and polls, many of which contained a different slate of films or different ordering thereof. In such conflicting instances, I have used the list derived from their affiliated periodical or publication.

    As per last year, for the purposes of this compilation I have excluded the lists from all non-North American sources on the grounds that films which have been released internationally will clearly have an unfair advantage over pictures which have yet to bow in foreign territories. (Films yet to be released internationally will obviously have little-to-no chance to appear on any foreign Top 10 lists.)

    This is not a perfect -- or even optimal -- system of measuring overall critical preferences by any stretch of the imagination; there are many glaring weaknesses with the methodology of extrapolating general critical favourites from top 10 lists.

    Films that did not receive wide theatrical releases prior to the end of the year (which includes late-year platform releases, limited release pictures and quite literally all foreign-language films) are handicapped under this system, as film critics in smaller markets will be unable to consider the pictures for their Top 10 lists on either practical grounds -- they haven't seen them -- or ideological ones -- they won't list films unavailable to their readership. (Some of the former can be offset by advance press screenings and the circulation of screener videos.) Simply put, virtually every critic has had the opportunity to see Gladiator and can consider whether or not to put it on their top 10 list; a decidedly smaller number could say the same about, say, Beau Travail or The Wind Will Carry Us or even something like The House Of Mirth. This should be considered when comparing the ranking of a huge wide-release studio picture like Erin Brockovich with that of a limited-release arthouse picture few have had the chance to see, like Ratcatcher.

    There's also the matter of the different sorts of film critics being surveyed -- while some film critics (mostly those based with smaller publications or with TV programmes, as well as some affiliated with large, very mainstream periodicals) stick to accessible Hollywood fare to populate their Top 10 lists (due either to editorial and/or readership constraints, or overall apathy towards cinema -- I continue to be aghast at reports of so-called critics who shun press screenings for arthouse films but pack 'em in for the latest Universal or Paramount or Fox offering), other film writers (frequently contributors to alternative publications) are willing (and able) to champion more adventurous fare -- small independent and experimental films, obscure foreign-language pictures, etc. This schism results in a rather schizophrenic final consolidated list, which strikes me as both unsatisfying and unavoidable under this technique.

    I'm also growing increasingly wary about my resolutely democratic weighting of each top 10 list. Each critic's list is presently weighted equally, but I'm unsure whether this is indeed appropriate; it's hard to dispute that some critics hold more influence than others, that some are better informed than others, and so on -- with all due respect, to give the same weight to a list from a prominent national film critic as to a reviewer for a local newspaper seems rather questionable. (And, of course, some critics are hardly "critics" at all.) One could make the argument that certain critics' lists should be given additional weight to take into account various factors, but the implementation of such a scheme would be tremendously problematic. (If, say, Owen Gleiberman is given a weighting of 3, then what's Lisa Schwarzbaum? A 3? 1.85? 4.53?)

    My system is completely unbalanced as it measures the positive (that is, critics who are sufficiently passionate about a film to put it on their Top 10 lists) without taking into consideration the negative (critics who are either ambivalent or hostile towards the picture). Consequently, a film with a small group of ardent supporters will fare more successfully in this system than one which receives tepid appreciation; extremely divisive films will also score highly, as the degree of loathing directed at them by some critics is discarded altogether.

    Ideally, rather than using Top 10 lists, the preferable methodology would be to have all critics grade films on a uniform assessment system -- the 4-star rating system, or the out-of-10 technique -- and then compute the mean for each movie. This would then lessen the impact of bloc support and take into account the sentiments of critics indifferent (or worse) towards various pictures. However, even this methodology has significant handicaps. The same star rating can be interpreted differently by film critics; one critic's "***" assessment isn't necessarily equivalent to another's. Jonathan Rosenbaum, the fine critic of the Chicago Reader, equates a three star rating with a verbal equivalent of "a must-see"; for Roger Ebert, the well-known film critic of the Chicago Sun-Times, three stars approaches his average rating. Rosenbaum deems a two star-rated film "worth seeing"; for other critics, such a rating would be considered an out-and-out pan. It's quite clear that different critics utilising the same basic rating system can generate vastly different results based on their own tendencies.
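
    As a rough illustration of that averaging, here's what the computation would look like in Python; the critics and grades below are invented purely for the example.

        # Ranking films by their mean grade on a uniform four-star scale.
        # Critics and ratings are invented for illustration only.
        from statistics import mean

        grades = {
            "Traffic":             {"Critic A": 3.5, "Critic B": 3.0, "Critic C": 2.0},
            "You Can Count On Me": {"Critic A": 4.0, "Critic B": 3.5, "Critic C": 3.5},
        }

        averages = sorted(((film, mean(g.values())) for film, g in grades.items()),
                          key=lambda pair: -pair[1])
        for film, avg in averages:
            print(f"{avg:.2f}  {film}")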

    These differing interpretations persist even when critics are provided with guidelines regarding star rating equivalence so that some degree of uniformity can be achieved. I participate in an annual survey which in part solicits participants to assess every commercial release on the Maltin four-star scale and explicitly provides a sentiment-to-grade conversion scale ("1.0 = Utter crap"); nevertheless, there are distinct variations in voter tendencies. Given that most every critic's grades fall into a Gaussian distribution -- and they do -- I developed a "Toughness" statistic which accounts for undue influences (i.e. a participant gives a disproportionate number of high star ratings simply because they exclusively see good movies) and effectively determines the delta between a voter's Gaussian peak and that of the norm. I was surprised to find that despite the provision of an explicit conversion scale, the overall spread between the stingiest and most generous voter exceeded a full star rating; that is, one man's "**" quite literally equals another's "***". Such significant variances would diminish the overall accuracy of using an average star rating system to rank overall critical assessments; still, it'd be a preferable method over making use of year-end Top 10 lists.
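
    My exact Toughness formula isn't reproduced here, but one simple way of getting at the same idea -- comparing each voter's grades against the per-film norm, so that a participant who happens to see only good movies isn't mistaken for a soft grader -- is sketched below with made-up ballots.

        # A "Toughness"-style measure: for each voter, average the gap between
        # their grade on a film and that film's mean grade across all voters.
        # Negative values = tougher than the norm, positive = more generous.
        # Ballots are made up for illustration; the actual formula may differ.
        from collections import defaultdict
        from statistics import mean

        ballots = {
            "Voter A": {"Film 1": 3.0, "Film 2": 2.0, "Film 3": 3.5},
            "Voter B": {"Film 1": 4.0, "Film 2": 3.0},
            "Voter C": {"Film 2": 2.5, "Film 3": 2.5},
        }

        per_film = defaultdict(list)
        for votes in ballots.values():
            for film, grade in votes.items():
                per_film[film].append(grade)
        film_mean = {film: mean(gs) for film, gs in per_film.items()}

        toughness = {
            voter: mean(grade - film_mean[film] for film, grade in votes.items())
            for voter, votes in ballots.items()
        }
        print(toughness)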

    Nevertheless, the Top 10 consensus method is not without some merit; one would be hard-pressed to claim that Crouching Tiger, Hidden Dragon, You Can Count On Me and Traffic were not largely well-received by the North American critical community. It's nearly irrefutable that the films near the top of the list received much critical acclaim, the films in the middle of the list less so, and so forth; objectively, there's a good correlation here. Whether the film ranking 27th was better received by critics than the movie ranking 28th (or 29th, or even 30th) is obviously debatable, but I presume it's understood that this degree of precision is unattainable: while the 1.409 point lead Before Night Falls holds over The Wind Will Carry Us is hardly definitive, I can state with a reasonable degree of confidence that the #15 film wasn't generally as well-received by North American film critics at large as the #5 picture, but was better liked than the #25 and #35 movies.

    As per the last couple of years, I've included "point totals" along with the rankings. While the figures themselves are largely incomprehensible, they should provide a vague idea of how films fared relative to each other for comparative purposes. The figure in square brackets next to the film title, where applicable, is the number of first-place votes the picture received.

    The results for 2000:

     1. Crouching Tiger, Hidden Dragon [40½] (Ang Lee) -- 1835.677
     2. You Can Count On Me [11] (Kenneth Lonergan) -- 1215.741
     3. Traffic [15] (Steven Soderbergh) -- 1042.018
     4. Almost Famous [10½] (Cameron Crowe) -- 1001.371
     5. Wonder Boys [4] (Curtis Hanson) -- 767.712
     6. Dancer In The Dark [7] (Lars von Trier) -- 763.134
     7. Yi Yi (A One And A Two...) [14] (Edward Yang) -- 675.883
     8. Erin Brockovich [1] (Steven Soderbergh) -- 657.856
     9. Requiem For A Dream [3] (Darren Aronofsky) -- 627.036
    10. Chicken Run [2] (Peter Lord and Nick Park) -- 563.763
    11. Gladiator [2] (Ridley Scott) -- 550.400
    12. Beau Travail [11] (Claire Denis) -- 523.942
    13. Croupier [4] (Mike Hodges) -- 494.800
    14. High Fidelity (Stephen Frears) -- 483.053
    15. O Brother, Where Art Thou? [3½] (Joel Coen) -- 451.600
    16. The House Of Mirth [3] (Terence Davies) -- 446.700
    17. Billy Elliot [1] (Stephen Daldry) -- 375.912
    18. Nurse Betty [1] (Neil LaBute) -- 367.709
    19. Before Night Falls [3] (Julian Schnabel) -- 353.509
    20. The Wind Will Carry Us [4] (Abbas Kiarostami) -- 352.100
    21. The Virgin Suicides (Sofia Coppola) -- 320.909
    22. Quills [1] (Philip Kaufman) -- 313.766
    23. Cast Away [1] (Robert Zemeckis) -- 306.500
    24. The Color Of Paradise [3] (Majid Majidi) -- 268.944
    25. L'Humanité [1] (Bruno Dumont) -- 260.100
    26. Best In Show (Christopher Guest) -- 256.323
    27. Ghost Dog: The Way Of The Samurai (Jim Jarmusch) -- 245.478
    28. George Washington [2] (David Gordon Green) -- 243.342
    29. In The Mood For Love [1] (Wong Kar-wai) -- 242.300
    30. Jesus' Son (Alison Maclean) -- 233.851
    31. Hamlet [4] (Michael Almereyda) -- 215.600
    32. Chuck & Buck [1] (Miguel Arteta) -- 214.704
    33. Ratcatcher [1] (Lynne Ramsay) -- 202.200
    34. A Time For Drunken Horses [1] (Bahman Ghobadi) -- 171.709
    35. Not One Less (Zhang Yimou) -- 170.615
    36. Shadow Of The Vampire [1] (E. Elias Merhige) -- 166.700
    37. State And Main [1] (David Mamet) -- 162.900
    38. Girlfight [1] (Karyn Kusama) -- 161.695
    39. Girl On The Bridge [1] (Patrice Leconte) -- 159.044
    40. Thirteen Days (Roger Donaldson) -- 156.800
    41. Unbreakable [1] (M. Night Shyamalan) -- 154.767
    42. The Filth And The Fury (Julien Temple) -- 153.750
    43. Time Regained [1] (Raoul Ruiz) -- 148.320
    44. Bamboozled (Spike Lee) -- 141.251
    45. Time Code [1] (Mike Figgis) -- 125.900
    46. Wonderland [1] (Michael Winterbottom) -- 122.500
    47. Human Resources (Laurent Cantet) -- 122.342
    48. Pola X [1] (Leos Carax) -- 119.400
    49. Sunshine (István Szabó) -- 117.800
    50. After Life [1] (Hirokazu Kore-eda) -- 114.953
    List generated with data as of 01/01/28

    Commentary


    Alex Fung (aw220@freenet.carleton.ca).
