
How Rotten Tomatoes Can Be Misleading

Disclaimer: I wrote this article back on July 2, 2012.

Before I actually started writing my own movie reviews, I read a lot of reviews from professional film critics. When I started this blog back in 2010, I used a website called Metacritic to see which films were worth seeing and which were not. The site compiles reviews from professional critics and stamps an averaged score on each film, which makes it easy to tell whether a film is generally good, bad, or just average. I don’t know when I started to switch over to Rotten Tomatoes, but when I did, I noticed key differences between the two review-compiling websites.

I’ve found that many, many people are misled by Rotten Tomatoes' way of rating films. A lot of people believe that a film’s percentage score on the site represents how great the film is out of 100. For example, a person can see that a film has an 88% score on Rotten Tomatoes and believe that the film is that good: 88 out of 100 good. This isn’t exactly the case. The film with the 88% score is not necessarily that good; rather, 88% of critics gave the film a positive review. Every one of the critics counted in that 88% could have given the film a mildly positive 3-out-of-4 score (75%), but Rotten Tomatoes will still stamp an 88% on the film, because the number measures exactly that: the share of “positive” reviews. In reality, those positive reviews can range from decent scores like 70/100 to perfect scores of 100/100, but to the website they all just count as “positive.”
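To make that concrete, here is a minimal sketch with made-up review scores (the 60/100 cutoff for what counts as a “positive” review is my own assumption for illustration): seven critics give a film a mildly positive 75/100 and one gives it 40/100, yet the positivity percentage still reads 88%.

```python
# Hypothetical reviews: seven mildly positive 75/100 scores, one 40/100.
reviews = [75, 75, 75, 75, 75, 75, 75, 40]

def positivity_percentage(scores, positive_cutoff=60):
    """Rotten Tomatoes-style number: percentage of reviews counted as positive.
    The 60/100 cutoff is an assumption for illustration."""
    positive = sum(1 for s in scores if s >= positive_cutoff)
    return round(100 * positive / len(scores))

def average_score(scores):
    """Metacritic-style number: plain average of the same reviews."""
    return round(sum(scores) / len(scores))

print(positivity_percentage(reviews))  # 88 -- looks like "88 out of 100 good"
print(average_score(reviews))          # 71 -- the film reads as merely good
```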

Metacritic’s way of averaging critic reviews is more straightforward. They don’t group reviews into positive and negative piles. Instead, they convert every review to a score and then average them all into one big score that gets stamped at the top of each movie’s page. When a critic reviews on a 4-star scale, Metacritic converts the rating to a numbered score out of 100: 4 stars is 100, 3.5 is 88, 3 is 75, 2.5 is 63, 2 is 50, 1.5 is 38, 1 is 25, 0.5 is 12, and 0 is 0. Letter grades are also converted on a specific scale: A and A+ are 100, A- is 91, B+ is 83, B is 75, B- is 67, C+ is 58, C is 50, C- is 42, D+ is 33, D is 25, D- is 16, F+ is 8, and F and F- are 0.
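Written out as code, that conversion looks roughly like this (the table values are the ones listed above; the function and variable names are just mine for illustration):

```python
# Star ratings on a 4-star scale mapped to 0-100, as described above.
STAR_TO_SCORE = {4.0: 100, 3.5: 88, 3.0: 75, 2.5: 63,
                 2.0: 50, 1.5: 38, 1.0: 25, 0.5: 12, 0.0: 0}

# Letter grades mapped to 0-100, as described above.
GRADE_TO_SCORE = {"A+": 100, "A": 100, "A-": 91, "B+": 83, "B": 75,
                  "B-": 67, "C+": 58, "C": 50, "C-": 42, "D+": 33,
                  "D": 25, "D-": 16, "F+": 8, "F": 0, "F-": 0}

def convert_review(rating):
    """Map a 4-star rating (e.g. 3.5) or a letter grade (e.g. "B+") to 0-100."""
    if isinstance(rating, str):
        return GRADE_TO_SCORE[rating.strip().upper()]
    return STAR_TO_SCORE[float(rating)]

print(convert_review(3.5))   # 88
print(convert_review("B+"))  # 83
```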

It should be noted that Metacritic uses a weighted average for its scores, so the score for a movie may not be the exact average of all the reviews listed. The averaged scores are slightly curved and may be a little lower or higher than the “exact” average. The site does this partly to create more of a distinction between scores, which allows for a better comparison between movies, and partly because some critics write better, more thought-out reviews than others, and some critics are more prestigious and respected than others. Reviews from critics like these are considered more important and are weighted more heavily in the average.

Metacritic also uses colors to signify the quality of the film being scored: a green score means the average critic gave the film a good score, yellow means average or mixed, and red means bad. An average score between 81 and 100 signifies “Universal Acclaim” (green). A film with an average score between 61 and 80 is considered to have “Generally Favorable Reviews” (green). A score between 40 and 60 means “Mixed or Average Reviews” (yellow). A score between 20 and 39 means “Generally Unfavorable Reviews” (red). And an average score between 0 and 19 means there is “Overwhelming Dislike” for a film (red).
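Here is a small helper (the name is my own) that maps an average score to the labels and colors described above:

```python
def metacritic_band(score):
    """Map a 0-100 average score to Metacritic's label and color band."""
    if score >= 81:
        return ("Universal Acclaim", "green")
    if score >= 61:
        return ("Generally Favorable Reviews", "green")
    if score >= 40:
        return ("Mixed or Average Reviews", "yellow")
    if score >= 20:
        return ("Generally Unfavorable Reviews", "red")
    return ("Overwhelming Dislike", "red")

print(metacritic_band(88))  # ('Universal Acclaim', 'green')
print(metacritic_band(55))  # ('Mixed or Average Reviews', 'yellow')
```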

Though Metacritic is the more precise website for averaging film review scores, I use Rotten Tomatoes more now because I find it more user-friendly. I like that the site writes a general consensus sentence for every score it gives out; it makes the score more engaging because it states the general opinion of what the film is like. I also like how pictures of the critics appear next to their reviews. Those pictures and the site’s bright colors make it more appealing to the eye. I’ll also note that when I look at Rotten Tomatoes scores, I usually click the “Top Critics” button next to the score to see the percentage of positive reviews professional critics actually wrote. The percentage for the top critics is usually different (sometimes slightly, sometimes significantly) from the percentage that includes all critics.

It’s not Rotten Tomatoes' fault that people are misled by its process. The site clearly explains how it works, but people visiting the site for a quick opinion on a film often mistakenly believe that the positivity percentage translates directly to quality.
