Is it Any Good? The Scourge of Rotten Tomatoes and Metacritic

You may have noticed that here on Plot and Theme I never attach a grade to my reviews. Distilling an entire film into a single number or letter has always rubbed me the wrong way, as it inherently removes any critical nuance from the discourse. But I am aware that most reviews do provide a grade in summation, and these can help gauge the overall quality of the film. More recently, with the rise of review aggregators like Rotten Tomatoes and Metacritic, scores of these reviews are condensed into a single number. The result is a peculiar derivative of a derivative – thoughts and words transformed into a number, then that number lost in a sea of others. The purpose of this piece is to explain that process in more detail, and ultimately to determine whether any of these procedures answer a simple question: Is <Insert Film > any good?

Let’s get this out of the way from the get-go: no grade, score, or full-fledged review of a movie can compare to seeing it for yourself. Well-reviewed movies can leave you unimpressed, and critically-panned ones can land right in your wheelhouse. This is not a discussion of the merits of movie critics or reviews as a whole. People like (or hate!) reading reviews for different reasons, but suffice it to say that I think critics can play an important role in evaluating the merit of a creative work. This piece is not about any of that – it is about aggregating multiple reviews into a digestible number for the purposes of “summing up” the quality of a movie.

Why is this even an issue to discuss? Rotten Tomatoes scores and the like are becoming the de facto way to package a movie for public consumption. I have seen “98% on Rotten Tomatoes!!!” used in TV spots in the past year, and have many friends who make their movie choices based on the score on the Tomatometer. This is especially true for smaller films that will never be able to boast “#1 movie in the nation” after a top-grossing opening weekend. Furthermore, Wikipedia pages for films list both the Rotten Tomatoes and Metacritic scores prominently (usually in the “Reception” section). Moreover, Google posts these same metrics at the very top of its search results for a film. At the very least, it is important to understand what these scores mean.

So, what goes into the Tomatometer? According to the website, “The Tomatometer™ rating represents the percentage of professional critic reviews that are positive for a given film or television show.” Essentially, Rotten Tomatoes curates a pool of critics who review films, either in print or online. Some of the more established critics are labeled “Top Critics”, which doesn’t really factor into the final calculation, but when you look at individual reviews you can see who those critics are. Each review is characterized as “positive” or “negative”, based either on the actual grade given in the review or on a general reading of it. It appears as though a review must grade a film better than 6/10 to be counted as “positive”. For example, scores of 3/5, or a C- on an A-F scale, are counted as “negative” reviews for The Revenant, which you can see here. If 60% or more of the total reviews are “positive”, the movie is rated “Fresh”. Otherwise, it is rated “Rotten”. There are some refinements which lead to “Certified Fresh”, but these are not important to the discussion at hand.
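In code, the whole Tomatometer boils down to a threshold and a percentage. Here is a minimal sketch, assuming the “better than 6/10” cutoff inferred above; the function names and structure are mine, not anything Rotten Tomatoes publishes.

```python
# A minimal sketch of the Tomatometer logic described above. The 6/10
# cutoff and the 60% "Fresh" threshold come from the text; everything
# else is illustrative, not Rotten Tomatoes' actual code.

def is_positive(score_out_of_10: float) -> bool:
    """A review only counts as 'positive' if it grades the film above 6/10."""
    return score_out_of_10 > 6.0

def tomatometer(scores: list[float]) -> tuple[int, str]:
    """Return the percentage of positive reviews and the Fresh/Rotten label."""
    positive = sum(1 for s in scores if is_positive(s))
    pct = round(100 * positive / len(scores))
    return pct, ("Fresh" if pct >= 60 else "Rotten")
```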

This is a far simpler system than the one used by Metacritic. Like Rotten Tomatoes, Metacritic takes a number of professional reviews, but instead of simply classifying them as “positive” or “negative”, it assigns each review an actual score on a 100-point scale. On the Metacritic website, you can see some of the conversions they perform to change a C+ into a 58/100, or 3.5 out of 4 stars into an 88/100, for example. Then, with all of these converted scores, Metacritic averages them together, assigning greater weight to more prestigious critics and publications. Hence, the score given by A.O. Scott of the New York Times counts for slightly more in the resulting average than the one from Jimmy the Critic on his blog. This is why Metacritic refers to its final score as a “weighted average”.
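The Metacritic procedure is also easy to sketch. The C+ → 58 and 3.5-of-4-stars → 88 conversions come from their published examples; the grade table and critic weights below are hypothetical stand-ins, since the real tables and weights are not disclosed.

```python
# A hedged sketch of the Metascore: convert every review to a 100-point
# score, then take a weighted average. The conversion table and weights
# here are invented for illustration; Metacritic does not publish theirs.

LETTER_TO_100 = {"A": 100, "B": 75, "C+": 58, "C": 50, "F": 0}  # illustrative

def stars_to_100(stars: float, out_of: float = 4.0) -> float:
    """Convert a star rating to the 100-point scale (3.5/4 -> 87.5, ~88)."""
    return 100.0 * stars / out_of

def metascore(scored_reviews: list[tuple[float, float]]) -> float:
    """Weighted average over (score_out_of_100, critic_weight) pairs."""
    total_weight = sum(w for _, w in scored_reviews)
    return sum(s * w for s, w in scored_reviews) / total_weight

# A more prestigious critic's 80 counts a bit more than a blogger's 80:
print(metascore([(80, 1.5), (80, 1.0), (40, 1.0)]))  # ~68.6
```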

Right away, we can see that these two scores, both of which are expressed as X/100, actually measure completely different things. The RT score is basically “what percentage of reviews are positive” (whatever that means), whereas the MC score is more of an average rating. It doesn’t take a great deal of imagination to see how these two systems could fail a movie fan. For the sake of argument, let’s look at two movies, which we will call Movie A and Movie B (still better titles than “Attack of the Clones”). Movie A receives 100 reviews on both RT and MC, half of which are 6/10 and half of which are 7/10. Movie B also receives 100 reviews, but they are all 6.5/10. It should be evident to all of you maths aficionados out there that Movies A and B both receive an MC score of 65, since that is how averages work. But over at RT, Movie A ends up with 50% on the Tomatometer and Movie B ends up at 100%. Wait, what?

See, as far as RT is concerned, all of the reviews for Movie B are “positive”, so it is happy to grant the 100%. By contrast, half of the reviews for Movie A are only 6/10, which is “negative”, so this film comfortably earns the “Rotten” moniker. Obviously, things are never this clear-cut in the real world, but this particular example should give you pause. It is clear that the Rotten Tomatoes system should not be viewed as a quantitative system, but as a qualitative one. When two different above-average films end up with wildly different scores, it becomes difficult to use the Tomatometer as anything but a general guide: X% of the critics thought this movie was above-average.
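Running the Movie A / Movie B thought experiment through the sketches above makes the divergence concrete (reusing the tomatometer function from earlier; a plain unweighted average stands in for the Metascore):

```python
movie_a = [6.0] * 50 + [7.0] * 50   # half of the reviews 6/10, half 7/10
movie_b = [6.5] * 100               # every review exactly 6.5/10

def average_100(scores: list[float]) -> float:
    """Unweighted stand-in for the Metascore on a 100-point scale."""
    return 10 * sum(scores) / len(scores)

print(average_100(movie_a), tomatometer(movie_a))  # 65.0 (50, 'Rotten')
print(average_100(movie_b), tomatometer(movie_b))  # 65.0 (100, 'Fresh')
```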

Recognize also that this kind of system rewards mediocre films, as another simple example makes clear: Movie C receives 100 reviews as before, all of them 6.5/10, whereas Movie D is one of the best films ever made, and every one of its 100 critics gives it a 9.5/10. Rotten Tomatoes awards the exact same grade to both of these movies: 100%. Perhaps the best way to sum this all up: the Rotten Tomatoes score cannot tell you how good a movie is, only whether most people think it is good.
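The same sketch shows the mediocrity problem directly: the Tomatometer cannot distinguish Movie C from Movie D at all, while even a plain average separates them cleanly.

```python
movie_c = [6.5] * 100   # uniformly mediocre
movie_d = [9.5] * 100   # a unanimous masterpiece

print(tomatometer(movie_c))                        # (100, 'Fresh')
print(tomatometer(movie_d))                        # (100, 'Fresh') -- identical
print(average_100(movie_c), average_100(movie_d))  # 65.0 95.0
```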

For more information on just how good, we need to turn to a more robust scale like Metacritic’s. Since this metric is essentially an average, we can be quite confident that the higher the MC score, the more universally beloved the film. If there is a weakness to this system, it is that a handful of dissenting reviews can weigh down the average and artificially lower a film’s score. This is why the people at Metacritic claim that any film scoring above 80 should be considered “universally acclaimed”, while films between 61 and 80 are characterized as receiving “generally favorable” reviews.
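That weakness is worth one more hypothetical: a divisive film that half the critics adore and half dismiss ends up with a middling average, even though nobody actually felt middling about it.

```python
divisive = [95.0] * 50 + [40.0] * 50   # adored by half, dismissed by half
print(sum(divisive) / len(divisive))   # 67.5 -- merely "generally favorable"
```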

But we have to be careful about both of these systems for a simple reason: each is in bed with a particular production company! Metacritic is owned by CBS, which is owned by National Amusements, which owns Paramount. Paramount Pictures produces films like Terminator Genisys and Mission: Impossible – Rogue Nation, and it benefits Paramount if Metacritic praises films like these. Similarly, Rotten Tomatoes is owned by Flixster, which is owned by Time Warner, which owns Warner Bros. Hence, higher RT scores are good for Time Warner movies.

So what’s the answer for those out there trying to decide what movie to see this weekend? On the surface level, you can get a good sense of how critics generally feel from the RT score. A movie mired in the low teens has a pretty good chance of being not-so-good, but may have some redeeming qualities that get lost in the aggregation process. By contrast, even a movie in the high 80s could have some flaws, and you’ll probably not be able to use Rotten Tomatoes to decide between an 87% and a 95% – too much of the nuance has been lost. Hopping over to Metacritic may help slightly, but you’ll likely be confronted with a similar problem thanks to the inherent nature of averages: it may be hard to tell whether you would enjoy the film with a score of 75 or the one with a score of 68. And the reason is simple: these review aggregators normalize out any and all specifics, until you are reacting to nothing more than a number.

A number cannot explain the majesty of 2001: A Space Odyssey or the horror of Alien. It will never tell you just how funny Ghostbusters is, nor express the quotability of The Big Lebowski. As lovers of movies and rational consumers, we do ourselves a great disservice by trying to distill a film into a nice, easy, numerical summation. Instead, get out there and read reviews from critics who intrigue you (note: not necessarily those you agree with!). Watch YouTube videos where the creators explain specific moments in a film and why those moments affected them so.

For in the end, Rotten Tomatoes, Metacritic, IMDB user grades, and anything else that seeks to establish a “consensus” grade for the quality of a film runs contrary to how humans actually experience one. We do not jot each moment and aspect of a film into a great ledger, add up the red and black ink, and come to an encompassing conclusion. Our jaws drop and our hearts skip at great moments, and our eyes roll at those that are unearned or poorly executed. Find critics who convey which films possess the moments you enjoy most, and then go support those films.


(Note: originally, the third paragraph of this piece claimed that the Wikipedia page of a film listed these scores near the top of the page. This is not true; I meant that these scores are reproduced next to the Wikipedia entry in the Google search results, but I failed to make that point clear. Thanks to /u/CuddlePirate420 on reddit and Scott Keith in the comments section for pointing out this mistake. I have since corrected the paragraph to its current form.)

7 thoughts on “Is it Any Good? The Scourge of Rotten Tomatoes and Metacritic”

    • Thanks for catching this! I meant that Google results show these scores alongside the Wikipedia page (and the official site, if the movie has one of those). I must have just combined the two in my head.

      Again, thanks for the correction, and I have edited the post to reflect what I was actually thinking.

  1. Derek, have you any clue why IMDB scores never got as popular as RT and MC? I for one have always used them, as they seem more accurate in my experience with films. Is IMDB also swayed by some big company? As far as I’m aware, they work purely on user ratings and average them, but perhaps I’m wrong.

    • No, you’re exactly right: IMDB is exclusively user ratings.

      On the one hand, this is good, because it democratizes the rating.

      But in reality, the result is a bunch of fanboys religiously upvoting a flick, or haters downvoting one. The simple fact that a person can register a vote on IMDB anonymously, without ever having seen the film, has really hindered its acceptance among most film fans.

      • Thanks. I guess mass fanboy voting or hate voting is an issue, but I imagine it should balance itself out overall.

        Great blog, keep up the good work 🙂

  2. Both Rotten Tomatoes and Metacritic are run by incompetent slobs who have no understanding of stats, averages, or basic math in general, not to mention reviews themselves.
    Start with their complete misinterpretation of reviews when attributing scores to them. (Metacritic is so stupid they regularly give 100/100 and 0/100 to reviewers for most films, which on a 100-point scale is absurd and basically impossible, as nothing is perfect or completely null; 0-10 is another story, as it is more general by a factor of 10! For example, a prominent critic was credited with giving zero out of 100 to a huge, high-profile Hollywood movie made by a superstar veteran, so I contacted the critic himself, who said he was done trying to deal with the Metacritic idiots, they’re so stupid. So while he trashed the film, yes, it being a huge and very well-made production, his review was probably actually a 20 or 30 out of 100.)
    Now imagine how these idiots are doing this rampantly for most critics whose reviews carry no actual scores, and mostly toward the top of the scale, routinely and utterly in error attributing 100/100 to reviews. It’s LUDICROUS!
    Now, the main problem with these idiot sites is that the final numbers they arrive at are UNIVERSALLY TOO HIGH! They skew way over, regularly. How many 90%-plus movies do you see on Rotten Tomatoes? An alarmingly high number. Yet how many do you see below 10%? NONE! Yet movies in general are worse than ever! At their best, there should be an equal amount at the very low end as at the very high end!
    Is this entirely the morons’ fault at RT and MC? As far as their dumb inaccuracy of averaging verbal-only reviews to extremes, yeah, but just as much blame goes to shill sheep critics who vastly, sickeningly OVER-rate and OVER-praise in general, yet pussy out on really sticking it to all the dogs, which are many! MORE movies are awful than are great, that’s for sure!
    So what you ultimately have are ratings over-weighted at the top by these fools! Metacritic is so stupid, in fact, that they actually slide the user 1-10 scale, with any particular number meaning something different in their games category than it does in their movies and TV categories! The stupid morons!
    If there are generally more highly rated games these days than movies and shows, then so be it! 5 is still dead center, meaning right on the border between pass or fail, thumbs up or down, or equaling average, whatever! You don’t slide the scale to accommodate the generally better-reviewed product of one medium so that its averages are more in line with other mediums! I told the morons this way back, so maybe they fixed it? I doubt it, knowing the idiots!
    Then take Rotten Tomatoes, which a while back had a fun little project rating film classics against today’s fare, and the stupid fools did the same fuckin thing! Except they didn’t slide the scale, they extended it! So movies like “The Godfather”, for just one example, would by today’s measure get a 120% rating or some such if released now, and they went on to list dozens of older, highly regarded movies all coming in at OVER 100% by today’s measure. WHAT?! The stupid-beyond-belief fools! That is not possible! If a film in 1972 received unanimously positive, perfect reviews, it would still clock in at 100% max! And the fact is, no film was ever universally praised by ALL as PERFECT!
    All this points to one thing, as stated earlier: these ratings are phony, a shill tool for studio marketing monkey ads. They skew way too high, bottom to top! Instead of final film averages on RT ranging from an already-too-high 20% to a ridiculous 98% or such, the overall slate of films released in a given year should go from around 5% to 95%, thus dropping about 1/5 of all product downward from artificial highs! Then if the idiots do a retro rating for, say, “The Godfather” (for the sake of argument about as near-perfect as films get, though still subjective), it would LEGITIMATELY come in at something like 98 or 99%, and absurdly, obscenely over-rated little films like, say, “Moonlight” (given artificial mass over-praise based purely on affirmative-action Leftist sheep critics) would drop down a few points from their ridiculously high scores. And that’s no matter how much you may pretend to love that movie! It’s a student film. It ain’t “The Godfather”; it’s not even ballpark comparable. Yet since it was massively over-praised, you can’t deny it the high rating. You must adjust to as good as it gets, so, idiots: bring the classics down below 100%, and if some of these stupid films today are so massively over-praised, then that’s just how it is. You don’t CHANGE THE SCALE!

    Bottom line, though: everything across the board is rated too high, from the awful being rated bad, to the bad being rated okay, to the okay being called good, to the good being called great, to the status of “great” being applied much, much too often!

    Rotten Tomatoes and Metacritic are incompetent as well as corrupt in their numbers.

