Confessions of a Review Site Owner

How the review game has changed over the last 10 years

It all started in early 2006.  While only 9.5 human years ago, it’s an eternity in internet years.  The iPhone hadn’t been invented.  Mobile web browsers were a joke.  Facebook was still only available to college campuses.  Online reviews weren’t widely found outside of Amazon.

It was a great time to start.  The idea first came to me when my college roommate gave me a bottle of NO-Xplode which he had paid $60 for – a ton of cash back in college.  He took it a few times and just couldn’t stomach it.  “Too bad he didn’t know before dishing out the $60,” I thought to myself.  And that’s when it clicked.

In 2006, major online supplement retailers didn’t have any customer reviews.  The only info they provided was what the brand wrote – and brands will write basically anything to get you to buy the product.

I used online tutorials to learn some basic web development skills, and within a month, my website was ready to launch.  Submitting a review only involved writing some text in a box, picking a rating between 0 and 10 and clicking “submit”.  We didn’t ask your name, age, email, gender, experience or record any other info.  It was completely anonymous, had zero accountability and ran on the honor system.  And it worked.

The site was an instant success.  We started getting dozens of reviews each day.  Every review submitted was published and counted equally.  The product rating was simply the raw data average.  The site was rough, but I was continually making small tweaks and improvements.
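The original rating model described above can be sketched in a few lines: every review counts equally, and the product score is nothing more than the arithmetic mean of the submitted ratings.  This is an illustrative sketch, not the site's actual code.

```python
# Minimal sketch of the early rating model: every review counts equally,
# and the product score is the plain arithmetic mean of 0-10 ratings.

def raw_average(ratings):
    """Average a list of 0-10 ratings; None if there are no reviews yet."""
    if not ratings:
        return None
    return sum(ratings) / len(ratings)

print(raw_average([10, 9, 10, 2, 7]))  # 7.6
```

The simplicity is the point: with no accounts and no weighting, a single motivated person could shift a product's score just by submitting more entries.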

Until about 2009, we were still small enough to fly under the radar.  Brands didn’t have much incentive to fudge their ratings because of our limited presence.  But as our reach and power of influence grew, so did the stakes for being number one on our site.

First Line of Defense: Accountability Through Registration and Email Verification

Between 2009 and 2010, we phased in a user system and eventually required all users to sign up before they could submit a review.  This drastically reduced the quantity of reviews submitted to the site, but helped weed out a lot of the obviously biased ones.  Also, we could now take a look at their email address.  Many times, when a review looked suspicious, we could see that the email was “[email protected]”.  This was an easy way for us to tell if someone was stuffing the ballot.

With a registration system, we could also better track the IP addresses that each user logged in with.  This enabled us to see connections between suspicious reviews and determine whether or not it was one person writing all the reviews.

Despite this, we still noticed slanted reviews coming in.  They’d usually arrive in clusters: 4 or 5 reviews in a row, all raving about how the product was the best supplement they’d ever used.  Our regular community members would start flagging these obviously biased reviews and get personally offended that some rep would come in and spam their product like that.

We took offense to it as well, so we decided to fight back.

Fighting Fake Reviews with Public Shaming

In 2011, we started the Hall of Shame.  This was a dedicated forum where we would publicly announce companies who we caught stuffing the ballot and creating fake reviews.

Since I had database administrator privileges, I could see a lot more information on each user than they were even aware of.  When our community members reported suspicious reviews, we could use our detective skills to dig into the data and see if we could draw any conclusions.

Here’s a few ways we could tell if a brand rep was posting fake reviews:

  • Search their email on social media.  Every user had to verify an email address, which nobody except for me could see.  You’d be surprised at how much info you can gather on someone just by stalking them on social media channels.
  • Google their username to see if they were signed up at any other forums.  People often recycle the same usernames, so it’s easy to see their posting history on other fitness forums, Reddit, etc.
  • Track their IP and see if it matches other reviewers of the same product, or if the location is the same city as their business headquarters.
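The third heuristic above lends itself to a simple query: group a product's reviewers by IP address and flag any address shared by more than one account.  The data shapes and names below are assumptions for illustration, not the site's real schema.

```python
from collections import defaultdict

# Hypothetical sketch of the IP-matching heuristic: flag products where
# several reviewer accounts share the same IP address.

reviews = [
    {"user": "alice", "product": "pre-workout-x", "ip": "203.0.113.7"},
    {"user": "bob",   "product": "pre-workout-x", "ip": "203.0.113.7"},
    {"user": "carol", "product": "pre-workout-x", "ip": "198.51.100.2"},
]

def shared_ip_users(reviews, product):
    """Group reviewers of one product by IP; return IPs used by 2+ accounts."""
    by_ip = defaultdict(list)
    for r in reviews:
        if r["product"] == product:
            by_ip[r["ip"]].append(r["user"])
    return {ip: users for ip, users in by_ip.items() if len(users) > 1}

print(shared_ip_users(reviews, "pre-workout-x"))
# {'203.0.113.7': ['alice', 'bob']}
```

In practice a shared IP isn't proof by itself (think offices or shared gyms), which is why it was one signal among several rather than an automatic ban.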

Every time we publicly shamed someone, we’d share the juicy details of our discovery.  Our community was excited we were proactively combating the fakers.  We caught nearly 100 different companies red-handed using this method.

Although we made it very clear to all new members that you could end up in the Hall of Shame if you were posting reviews as a rep, it didn’t seem to faze them.  The biased reviews kept coming in.

Incentivized Reviews: Not “Fake” but Still Biased in the Supplement Industry

In 2011, as Facebook was in a period of explosive growth, we noticed a disturbing trend of fake-looking reviews popping up on our site.  Our usual detective work wasn’t turning up any dirt because these weren’t the brand owners or reps writing the reviews.  In fact, it was actual customers writing reviews on products they had used, so our Hall of Shame was futile against these attacks.

After a few months of scratching our heads and wondering what to do, it finally clicked.  Brands were offering their loyal followers an incentive to sign up to our site and write a review.  Sometimes it was a discount code, sometimes it was a free t-shirt, shaker cup, samples, or an entry to win a bigger prize.

Even though these reviews were from real customers, they were completely biased and we felt they didn’t belong on our site.  It was very rare to see an incentivized review that wasn’t a 9 or 10 out of 10.

Disclaimer:  Incentivized reviews don’t work in the supplement industry, but they may be viable in other industries.

There is a TON of brand loyalty and marketing hype in the supplement industry.  Supplement companies often sign professional athletes or bodybuilders as a marketing tool to tap into their existing fan-base.  We see a lot of sensationalism because products are based on confusing pseudo-science and appeal to one’s emotional desire to look better.

The average Joe isn’t always qualified to review products since most of the time the results are so subtle that they have no idea if it’s actually working or not, and end up simply regurgitating the marketing on the label.  We even tell brands NOT to send their fans to our site for reviews, and instead offer an avenue for them to send their products to our experienced reviewers – a solution that keeps the playing field level.

This isn’t the case in most other industries.  For example, if someone paints your house, the layman is well equipped to comment on their promptness, responsiveness, and the quality of the work.  You’ll rarely see a painting company hire professional bodybuilders or athletes to generate a die-hard loyal following.  Furthermore, it’s impossible for a painter to send free house paintings out to experienced reviewers.

Our policy to disallow incentivized supplement reviews was a decision made based only on the current environment of our industry, and doesn’t necessarily condemn incentivized reviews in other industries.

Sadly, as we had been learning, just asking nicely isn’t going to stop anyone from trying to cheat the system.  We were determined to find a clever, data-driven solution.

Reviewer Authenticity Score: Our First Step Towards a Data-Driven Solution

In 2012, we had major drama with one brand.  They created hundreds of accounts, all with different IP addresses, emails, usernames, and posted quality (positive) reviews on their products.  The community was up in arms seeing the 10/10 reviews coming in every single day, knowing they must be fake, and my detective work was coming up with nothing.  Did they actually beat us?

Up late one night, while frantically searching the data to see if there was any trail of evidence strong enough to delete and ban all those fake reviewers, I had another “ah-ha” moment.  What if we write an algorithm to automatically look at all the data we have and generate an “Authenticity” score for each product?

A few weeks later, every product had been assigned a “Reviewer Authenticity Score” – a number from 0% to 100% which represented how “natural” the reviews for that product appeared.  For example, if all the reviews were written on the same day by one-hitter-quitter users (a user that signs up, posts one review and never comes back), that product would probably have a 0% Reviewer Authenticity and get a big “Mistrusted” flag on their product.
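The article doesn't publish the real formula, but the two signals it names (one-hitter-quitter accounts and same-day clustering) suggest a sketch like the following.  Everything here – the field names, the equal weighting of the two signals – is a hedged guess for illustration.

```python
from datetime import date

# Hypothetical "Reviewer Authenticity" score built from the two signals
# the article describes: one-hitter-quitter reviewers and same-day posting.

def authenticity_score(reviews):
    """reviews: list of dicts with 'user_review_count' and 'posted' (a date).
    Returns 0-100; higher means the reviews look more natural."""
    if not reviews:
        return 100.0
    # Share of reviewers with more than one lifetime review.
    repeat = sum(1 for r in reviews if r["user_review_count"] > 1) / len(reviews)
    # Share of distinct posting days (1.0 = every review on its own day).
    days = len({r["posted"] for r in reviews}) / len(reviews)
    return round(100 * (repeat + days) / 2, 1)

# Five reviews, all from single-review accounts, all posted the same day:
suspicious = [{"user_review_count": 1, "posted": date(2012, 3, 1)}] * 5
print(authenticity_score(suspicious))  # 10.0
```

A scheme like this makes the false-positive problem mentioned next easy to picture: a genuinely popular product launch can also produce a burst of same-day reviews from new accounts.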

I was pretty smug about this.  It seemed to be working pretty well, but there were a few fatal flaws.  First, it wasn’t 100% accurate.  There were false positives, and brand reps were not happy about that.  Second, and most fatally, it was not well understood by the layman.  It was becoming obvious that this advanced concept was going over the heads of users who just wanted to instantly know “is this product good?”.

We ended up rolling it back in 2014 to make way for something a little more effective.

Not All Reviews are Created Equal: Member Trust and Weighted Averages

By our 8th birthday, our community had built up a core group of trusted, established members.  They were known for having a critical eye for supplements, plenty of experience, and the ability to write detailed, thorough reviews.  How was it fair that a review from one of the most trusted and established members counted the exact same as one from someone who signed up yesterday, posted one low-quality review and never came back again?

It wasn’t fair, and that’s why we decided to create Member Trust and Weighted Averages.  Here’s how it works: every new user starts with 0% Trust and has to work their way up to 100% by submitting quality reviews, participating in our community and sticking around for a long period of time.  Many users never achieve 100% Trust, and their reviews aren’t weighted as heavily.

Our calculation for the average product rating changed as well.  While the math may be difficult for the layman to understand, it still produces one single number that everyone can relate to.  Our formula takes into account Member Trust, review age, and quantity of reviews before spitting out an average rating for each product.
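The site's actual formula isn't disclosed, but the ingredients named above (Member Trust, review age, and review quantity) fit a standard weighted-average-with-prior shape.  The sketch below is an illustration under those assumptions: each review's weight is its author's Trust decayed by age, and products with few reviews are pulled toward a neutral prior.

```python
# Illustrative weighted rating: weight = trust * age decay, plus a neutral
# prior so a handful of reviews can't swing the score too far. The constants
# and the formula itself are assumptions, not the site's real math.

def weighted_rating(reviews, prior=5.0, prior_weight=3.0, half_life_days=365):
    """reviews: list of (rating, trust, age_days) tuples. Returns 0-10."""
    num = prior * prior_weight
    den = prior_weight
    for rating, trust, age_days in reviews:
        w = trust * 0.5 ** (age_days / half_life_days)
        num += rating * w
        den += w
    return round(num / den, 2)

# Five fresh 10/10 reviews from brand-new (0% Trust) accounts move nothing:
print(weighted_rating([(10, 0.0, 0)] * 5))  # 5.0
# One 10/10 from a fully trusted member nudges the score up:
print(weighted_rating([(10, 1.0, 0)]))      # 6.25
```

This is exactly the property the next paragraph describes: a flood of zero-Trust reviews has no effect on the published average.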

No longer can companies spam their way to the top of our list by sending a bunch of their followers to post 10/10 reviews on our site.  This solution was completely effective at keeping our data and average ratings free from brand influence, but we still had the problem of suspicious reviews showing up (even though they didn’t actually count).

Nail in the coffin with TROOPs and Approval Center

It was early 2015, and we were still noticing brand reps sending their loyal fan army to flood our site with low-quality, one-hitter-quitter reviews on their products.  These reviews didn’t actually affect their ratings, but they were still spamming away – either they didn’t understand or they just didn’t care.

Our community was still upset about these types of reviews invading our site, and we decided that we’d had enough.  On our 9th anniversary, we created the Approval Center.  This meant that all new members had to get their reviews manually approved by at least 3 of our top members before they’d be published on our site.
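The Approval Center rule reduces to a small state machine: a new member's review stays pending until enough top members approve it, and any rejection sends it back for edits.  The threshold comes from the text; the status names and function shape are assumptions.

```python
# Minimal sketch of the Approval Center workflow: a new member's review
# publishes only after 3 approvals from top members; a rejection sends it
# back for edits. Status labels are illustrative.

APPROVALS_NEEDED = 3

def review_status(approvals, rejections):
    """approvals/rejections: counts of votes from top members on one review."""
    if rejections > 0:
        return "sent back for edits"
    if approvals >= APPROVALS_NEEDED:
        return "published"
    return "pending"

print(review_status(approvals=3, rejections=0))  # published
print(review_status(approvals=1, rejections=1))  # sent back for edits
```

Gating publication on human votes trades throughput for quality, which matches the outcome described next: fewer reviews overall, but a much higher standard.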

The quality standard for reviews skyrocketed, and anyone trying to push a low-quality review through was rejected and sent back for edits.  Low-quality reviews were a thing of the past.  Although the problem was solved, brand reps were still unhappy that it was now impossible to get reviews on our site.

In response to their concerns, we set up the Trusted Reviewer Open Opportunity Program, or SR TROOPs for short.  This gives every brand rep the chance to get reviews from trusted members on our site.  All they have to do is send out some products and they’ll get their unbiased reviews.  Their products get real reviews, our reviewer gets a free product, and nobody is upset about fake reviews getting posted on our site.

Finally, the ratings on our site are based on the merits of the actual product, rather than the size of the brand’s following.

What’s next?

Over the last 10 years, the only way we’ve been able to combat fake, low-quality and biased reviews is to spend a lot of effort vetting each new reviewer.  The anonymous free-for-all strategy has long been extinct and we’ve found success in encouraging quality over quantity.

Our plans for the future include getting more intimate and exclusive with our core group of trusted reviewers, and carefully growing this team so that the vast majority of reviews on our site are written by the regulars, not the randoms that show up to post one review and then disappear into cyberspace.

Tommy Noonan is the founder of, an unbiased review site dedicated to bringing honesty into an industry full of scams, spin and crazy marketing hype.