Talking Movies: Decoding the Movie Ratings at IMDb and Rotten Tomatoes

Welcome to Talking Movies, I’m Spling.

This week we’re talking about Decoding the Movie Ratings at IMDb and Rotten Tomatoes. Before the internet, we only had newspapers, magazines, television, radio, and word of mouth for our steady supply of movie culture intrigue and information.

Having information at our fingertips is empowering, and it’s how consumers are making their choices in this digital age. Why wouldn’t you do a quick search, consult AI, or reference a website before making a decision? The same goes for film reviews and movie critics like myself. There was a time when readers, viewers, and listeners would only have a few voices when it came to picking the right film for movie night.

While it was a subjective process, readers could learn to decipher and trust certain voices. Figuring out a critic's special interests, favorite genre, general disposition, and even their rating allocation helped make them relevant to people, even if readers didn't necessarily agree with their ratings. Now that the floodgates have opened, you can get opinions on film from the well-respected RogerEbert.com right through to your movie buddy's social media status update.

Film critics are now pontificating across the spectrum of media, from YouTube channels and blogs through to more traditional print media and established online publications. Moviegoers now feel armed with the power of IMDb and Rotten Tomatoes, which can guide them to better movies based on their ranking systems. IMDb's rating system provides an out-of-10 score based on user ratings, alongside links to critic reviews.

Boiling thousands of opinions down to a single number gives viewers a semblance of what to expect, so it seems like a safe bet. Respected by industry professionals and movie lovers alike, it's widely referenced and often ranks high in Google searches. However, it's not the be-all and end-all, as it's often skewed by the loyal user base's demographics, tastes, and preferences.

In an ideal world, the rating system would best represent a verified viewer's take, but it's subject to abuse. Besides voting syndicates using the platform to skew public opinion about certain films, there's a dedicated bunch who try their best to protect their hallowed selection of films from being downgraded in stature. Just click through one of the out-of-10 ratings on IMDb to see the voting allocation.

Almost every film has a scattered allocation of ratings with a glut of 10/10 and 1/10 votes. Since the platform runs on an honesty system, there's no real way to confirm raters have even seen the movie they're scoring. Whether it's boosting a film's rating just to sell more tickets or trashing it to thwart its success, it's a flawed system that somehow still ends up representing a film's overall standing.
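To see how far a coordinated bloc of votes can drag a score, here's a minimal sketch using a plain arithmetic mean over a made-up vote histogram. IMDb's actual formula is a weighted average with undisclosed adjustments, so treat this as an illustration of the principle rather than the site's method.

```python
# A rough illustration of how a coordinated bloc of votes drags an average
# around. This uses a plain arithmetic mean over a made-up histogram;
# IMDb's real formula is a weighted average with undisclosed adjustments.

def mean_rating(histogram):
    """histogram maps each 1-10 score to the number of votes at that score."""
    total_votes = sum(histogram.values())
    total_points = sum(score * votes for score, votes in histogram.items())
    return total_points / total_votes

# An organic audience: most votes clustered around 6-8.
organic = {5: 300, 6: 900, 7: 1400, 8: 800, 9: 300, 10: 150}
print(f"Organic mean:  {mean_rating(organic):.2f}")   # ~7.09

# The same film after a syndicate dumps a glut of 1/10 votes on it.
brigaded = {**organic, 1: 1200}
print(f"Brigaded mean: {mean_rating(brigaded):.2f}")  # ~5.64
```

A relatively small, determined bloc shifts the headline number by well over a full point, which is exactly why those lopsided 10/10 and 1/10 columns matter.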

Rotten Tomatoes is strict when it comes to accepting film critics into the fold, which makes its system seem more robust. While it's easier to get an idea of critics' general position based on the consensus and capsule reviews, there are some issues. For starters, most people don't understand the Tomatometer percentage score that is often referenced as a standalone figure when deciding on a movie.

This is the percentage of critics who gave a film a fresh (positive) review. It's not a sliding scale comparable with an out-of-10 rating. You can ostensibly get a 6-out-of-10 movie with a 100% fresh score, meaning every critic who reviewed the film considered it fresh, yet thought it no more than just better than average.
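A quick sketch makes the distinction concrete. The review scores below are hypothetical, and the fresh threshold of 6/10 mirrors the commonly cited Rotten Tomatoes cutoff; the point is simply that the two numbers measure different things.

```python
# A minimal sketch of why a Tomatometer percentage isn't an out-of-10 score.
# The review scores are hypothetical; 6/10 and up counts as "fresh" here,
# mirroring the commonly cited Rotten Tomatoes cutoff.

reviews = [6.0, 6.0, 6.5, 6.0, 6.5, 6.0]  # every critic is mildly positive

fresh_pct = 100 * sum(score >= 6.0 for score in reviews) / len(reviews)
avg_score = sum(reviews) / len(reviews)

print(f"Tomatometer:   {fresh_pct:.0f}% fresh")  # 100% fresh
print(f"Average score: {avg_score:.1f}/10")      # 6.2/10 -- just above average
```

Six lukewarm thumbs-up produce a perfect 100%, while the same reviews average out to a thoroughly middling score.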

Another issue is determining a mean score from the film critics, since each of them is working on a different rating system, or none at all. How do you credit a positive review with a rating, and how do you reconcile a score from an out-of-4 rating system with one out of 10? These discrepancies can alter ratings quite substantially when pulling from a selection of fewer than 100 reviews. While Rotten Tomatoes has moved away from aggregating critic scores, there does seem to be a need to anchor the Tomatometer to another element for context.
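Here's a minimal sketch of that normalization problem, assuming a simple linear rescaling and an arbitrary stand-in value for unrated positive reviews; neither assumption reflects Rotten Tomatoes' internal rules.

```python
# A sketch of the normalization problem: mapping reviews on different scales
# onto a common out-of-10 scale. The linear rescaling and the 7.5 stand-in
# for unrated positive reviews are assumptions for illustration, not
# Rotten Tomatoes' internal rules.

UNRATED_POSITIVE = 7.5  # any value chosen here is a judgment call

def to_out_of_10(score, scale_max):
    """Linearly rescale a numeric score; fall back to a stand-in if unrated."""
    if score is None:
        return UNRATED_POSITIVE
    return score / scale_max * 10

reviews = [
    (3.0, 4),      # 3/4  -> 7.5
    (4.0, 5),      # 4/5  -> 8.0
    (7.0, 10),     # 7/10 -> 7.0
    (None, None),  # positive review with no rating -> 7.5 by assumption
]

scores = [to_out_of_10(score, scale_max) for score, scale_max in reviews]
print(f"Mean: {sum(scores) / len(scores):.2f}/10")  # 7.50
```

With fewer than 100 reviews in the pool, nudging that stand-in value up or down visibly moves the mean, which is exactly the discrepancy described above.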

User reviews have also come into play on the website, allowing ordinary filmgoers to flex their film critic muscle by chiming in with a review and rating. This is more quantifiable, and it's useful to see how the audience score matches up against the overall critic percentage. Having everyone submit their opinion via the same channel makes it possible for Rotten Tomatoes to offer a considered consensus, which carries more credibility when linked to a user account. Taking the time to string a few sentences together also makes it easier to sift the have-seens from the have-not-seens.

Perhaps they should be looking into requiring critics to adopt a similar submission scheme. As it stands, there isn't a perfect system, since each consensus rating rests on unverifiable reviewer ratings. While Rotten Tomatoes distinguishes super reviewers and has its certified movie critics, it seems that things are coming full circle.

Since social media and faceless publications have proven to be fallible, often with ulterior motives, the need for experts who have become trusted authorities in their fields is on the rise again. While the power of consensus and trust has become a currency through apps with linkbacks to Facebook profiles for credibility, these avatar-based systems can only take one so far. What entertainment journalism, and news reporting in general, requires is time-honoured integrity and the transparency to win people's trust without a shadow of suspicion.

For more movie reviews, interviews and previous Talking Movies podcasts, visit splingmovies.com. And remember, don’t wing it, SPL!NG it.