Star gazing … how reliable are online user ratings?

When we’re buying something on Amazon, we all glance at the user ratings, right?

5 stars, it’s a keeper … 1 star, it’s a bummer.

Real reviews from real users.

What could be more accurate?


=======

Some researchers tried to answer that question.

Consumer Reports has been in the quality-testing business for decades, with a reputation for rigor, objectivity and impartiality. So, to test the reliability of user ratings, the researchers took Consumer Reports’ scores for 1,272 products and compared them to more than 300,000 Amazon ratings for the same items.

Their findings may surprise you …

======

According to an ArsTechnica.com summary of a JCR paper titled “Navigating by the Stars: Investigating the Actual and Perceived Validity of Online User Ratings” …

Researchers found that there was a “very poor correlation between user ratings and CR scores.” Products with higher user ratings weren’t particularly likely to have great CR scores, or vice versa.

For around a third of the product categories tested, the correlations were actually negative (that is, the higher the CR score, the lower the Amazon ratings).
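For the curious, here’s a minimal sketch of what “checking the correlation” might look like. It’s in Python, with a hypothetical data file and column names (“category”, “cr_score”, “amazon_stars”) — not the researchers’ actual code — just to make the comparison concrete:

```python
# Minimal sketch (assumed data, not the study's code): for each product
# category, how well do Amazon star ratings track Consumer Reports scores?
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical file: one row per product, with its category,
# Consumer Reports score, and average Amazon star rating.
df = pd.read_csv("ratings_vs_cr.csv")

for category, group in df.groupby("category"):
    rho, p = spearmanr(group["cr_score"], group["amazon_stars"])
    print(f"{category}: rho = {rho:.2f} (n = {len(group)}, p = {p:.3f})")
```

A rank correlation (rho) near zero in a category means the stars tell you little about measured quality; a negative rho means higher-quality products actually earn fewer stars there.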

What the heck is going on?

It’s more than companies dishing out incentives for positive reviews or using bots to stuff the ballot boxes.

There are some fundamental problems.

======

First, there are sampling issues.

Some items have very few respondents … hardly a projectable sample.

But, even products with high numbers of very similar ratings didn’t correlate particularly well.

Why not?

Well, in part, there’s the question of sample bias.

“The kind of person who writes a review isn’t necessarily a good representative of all people who bought the product. Review-writers are likely to be people who have had either a very positive or very negative response to a product.”

And, those vociferous “bell cows” may stimulate the herding instinct of dependent-minded followers.

======

Even correcting for sample size and bias, there are some subtler issues.

For example, user ratings seem to be heavily influenced by brand image.

Premium brands and more expensive products had inflated user ratings.

It’s possible that expensive, premium products are better quality and therefore deserving of those ratings.

However, the researchers controlled for CR scores to get a sense for whether premium products were rated higher regardless of their quality.

They were — and brand image explained a lot more of the variability in the ratings than quality did.
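Here’s a rough sketch of what “controlling for quality” can mean in practice — again with assumed data and column names, not the paper’s actual model. The idea: regress star ratings on both the CR score and brand/price variables; if the brand and price coefficients stay large once quality is in the model, brand image is predicting ratings over and above measured quality:

```python
# Rough illustration (hypothetical data and columns, not the paper's model):
# does a premium-brand flag still predict star ratings once the CR quality
# score is held constant?
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ratings_vs_cr.csv")  # hypothetical file: one row per product

model = smf.ols("amazon_stars ~ cr_score + premium_brand + price", data=df).fit()
print(model.summary())  # compare the cr_score coefficient to premium_brand/price
```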

People who base their purchasing decisions on user ratings are obviously getting information that extends beyond quality.

=======

So, how to use the reviews?

Go ahead and get a general sense from the stars earned.

Then, dig deep into the written reviews … especially the negative ones.

Even though they may be few in number, they can usually give you a sense of how bitter a lemon might be … in MBA-speak, they can help you calibrate the downside risk.

======

Thanks to Eddie C for feeding the lead.

======

#HomaFiles

Follow on Twitter @KenHoma >> Latest Posts
