Asking your patients to write reviews for you online is not only recommended…it’s absolutely essential.
Online reviews have become a key factor in selecting a healthcare center or provider, or any business for that matter. Market research by BrightLocal suggests that 72% of people use reviews as part of their decision process, and 30% decide solely based on reviews on websites like Google, HealthGrades, and Yelp. At a recent event for its partners, Google reported that 88% of patients use a Google search to find a physician or treatment center, while 30% search directly on a review site.
It is therefore critical for healthcare providers that their online reviews on these platforms are a true reflection of the service they provide.
Review platforms try their best to ensure that reviews are consistent with the actual performance of a business. Each platform has developed its own algorithm to weed out fake reviews, both good and bad. Yelp goes so far as to forbid asking for reviews, on the theory that even a general request would bias the results toward the positive.
While working with our client base of primarily healthcare professionals, we at GMR Web Team (GMR) were fully aware of the quality of service our clients were providing. Most of them were performing exceptionally, and their patient base was generally happy and loyal. However, their diligence in providing a satisfying patient experience wasn't properly reflected in their online review scores.
GMR suspected that the reviews of its clients who did not have a process for asking for reviews were inconsistent with their actual patient satisfaction. To confirm this suspicion, GMR decided to institute a patient satisfaction measurement program. Since we were only trying to confirm an assumption, we started with a small sample of our clients: 11 providers.
After a month of diving into the data, we were convinced that there was a major disconnect between actual patient satisfaction and the reviews posted online about these providers. The number of unhappy patients for these providers was actually very small, but they were far more vocal on review sites than the happier patients. Happy patients, it turned out, were mostly reluctant to write reviews, or simply didn't think to.
A study by the Accreditation Association for Ambulatory Health Care, Inc. (AAAHC) suggests the same: satisfied patients share their positive experience with five others on average, while dissatisfied patients complain to nine (or more) other people.
Ryan Erskine, Senior Brand Strategist at BrandYourself, had this to say about asking for reviews and the measures that Yelp is taking against this effective practice:
The major issue with anonymous platforms is their tendency to skew toward the negative. That’s true whether you look at review platforms like Yelp and Glassdoor, anonymous forums like 4chan, or the troll-like comments commonly found beneath YouTube videos. Unfortunately, that inherent negative bias means that disgruntled and angry customers are more self-motivated to leave reviews than their satisfied counterparts. That’s why Yelp’s Don’t Ask Policy is so problematic. Businesses that don’t prompt satisfied customers to leave reviews are often left with a self-selected sample of upset customers that doesn’t represent the total client population. By asking customers for reviews, businesses correct for an inherently negative reviewer bias, a move that helps consumers make better-informed purchasing decisions.
How Did GMR Proceed with the Study?
We compiled the online reviews of 11 providers (GMR clients) from six online review platforms: HealthGrades, RateMDs, Vitals, Google, Facebook, and Yelp. We spent a month measuring how many reviews they were receiving on each of the six platforms, which gave us an average number of reviews per month for each platform. This was before the providers had started asking for reviews.
After that, GMR instituted a program in which patients were asked to write a review about their experience on any of the review platforms. Patients were told clearly to rate and review their providers based on exactly how they felt during their visit, good or bad. A trickle of reviews started to flow in, and after six months we compared the two sets of data.
Here’s the comparative analysis of the two sets of data that we collected before and after the providers were asking their patients to write reviews:
* Average Monthly Reviews is calculated by dividing the total number of reviews by the total time span (in months) over which those reviews were obtained.
** Star Rating is the average of all ratings given by patients on a 0–5 (star) scale.
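The two footnoted metrics are simple to reproduce. Here is a minimal sketch of both calculations; the sample numbers are illustrative only, not data from the study:

```python
def average_monthly_reviews(total_reviews, months):
    """Total number of reviews divided by the time span in months."""
    return total_reviews / months

def star_rating(ratings):
    """Mean of individual patient ratings on a 0-5 star scale."""
    return sum(ratings) / len(ratings)

# Illustrative example: 18 reviews collected over a 6-month span
print(average_monthly_reviews(18, 6))   # 3.0 reviews per month

# Illustrative example: five patient star ratings
print(star_rating([5, 4, 5, 3, 5]))     # 4.4 stars
```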
As shown in the data above, the average number of monthly reviews increased greatly for all providers on all six review websites. For most providers, the average number of monthly reviews had been 0 and rose significantly after they started asking for reviews. This also produced an overall increase in their star ratings, though a few providers saw minor drops, which shows that the previous star ratings were misleading even when they weren't skewed toward the negative.
The star rating for Provider 2 shot up from 3.1 to 4.85, while for Provider 5 it rose from 3.8 to 4.75, and for Providers 8 and 10 it went from 3.2 to 4.65 and from 3.5 to 4.75, respectively. This validated our suspicion that the reviews (before asking) were inconsistent with actual patient satisfaction. When left alone, the sample is limited to patients who are familiar with review sites and passionate enough to post a review.
An overall more positive online reputation really meant that positive reviews, previously infrequent, started flowing in (alongside the negative ones) far more often once providers began asking patients to write reviews. GMR was glad to see that happy patients, who had previously been reluctant or simply hadn't thought to write reviews, started sharing their experiences.
Providers 1, 4, and 6 registered a slight decrease in their satisfaction score. This happened because the original sample of published reviews was small, and it shows that increasing the frequency of reviews by asking for them isn't inherently biased toward the positive. Asking for reviews isn't about inflating a business's online reputation; it's about making the sample large and general enough to be a true reflection of the business.
By now, we were sure that increasing the number of reviews is vital for reflecting the true picture of a provider's patient satisfaction. But we had studied 11 providers whom we knew to be admired by their patients, so we couldn't completely rule out the possibility that asking for reviews biases ratings toward the positive.
To be completely sure about the effect of review solicitation on how well reviews reflect patient satisfaction, GMR decided to conduct the same study with the same method, but on a larger scale. This time, instead of measuring data for individual practitioners, we looked at whole practices, because a practice contains a mix of all kinds of providers, and no large practice is perfect.
Here’s what we found:
Not surprisingly, we got varied results for these practices. Practice 2 saw a slight increase in star rating (4.57 to 4.85). Practices 1 and 3 saw a dip (5 to 4.8 and 5 to 4.7, respectively). Practices 5, 6, and 7 saw clear jumps (3.14 to 4.75, 2.17 to 4.45, and 2.75 to 4.7). Practices 4 and 8, which previously had no reviews at all, ended up with star ratings of 4.7 and 4.35 (above average) after they started asking for reviews.
In-depth analysis of both studies led us to the following conclusions:
- Where the online reviews are already consistent with the true quality of service, the satisfaction score stays roughly the same even after asking for reviews (as with Providers 7 and 11 in Table 1, and Practice 2 in Table 2). The benefit here is that, with a larger number of reviews, it becomes easier for prospective patients to trust the review scores.
- Occasionally, the star rating drops after asking for reviews (as with Providers 1, 4, and 6 in Table 1, and Practices 1 and 3 in Table 2). This refutes the claim that soliciting reviews skews ratings toward higher stars; in fact, it shows that not asking for reviews can paint a deceptive picture.
- Having no reviews is about as bad as having negative reviews. When a provider or practice is doing great (as with Practices 4 and 8) but nothing about its services is reflected online, it loses all credibility with prospective patients.
- Some truly good healthcare businesses were positioned as average or below average because the perspective of the majority of their patients was missing from the online reviews (as with Providers 2, 3, 5, 8, and 10 in Table 1, and Practices 5, 6, and 7 in Table 2). Inviting patients to write reviews helped these providers and practices present the true picture.
Your first reaction to the data above may be that the overall scores were still generally positive, so how does this prove the absence of a positive bias? Our Patient Satisfaction Benchmark Survey Report supported the finding that these overall positive ratings were in fact consistent with the services these practices provided. The survey was conducted from January 2017 to June 2017 on 26,000+ patients and focused on measuring the actual level of patient satisfaction with these providers and practices through a personal survey, free of any bias. Patients were asked how likely they were to recommend the practice or provider on a scale of 0 to 10 based on their latest experience, and to explain the reasons behind their rating. According to the report:
- 88.71% of patients visiting urgent care centers, 93.75% of those visiting primary care practices, and 89.43% of those visiting dentists reported walking away with highly positive feelings.
- A sentiment analysis of these patients found that 90% of those visiting urgent care centers, 92% of those visiting primary care practices, and 89% of those visiting dentists felt happy with their experience.
These figures were very much in line with those in the "after asking" sections of Table 1 and Table 2 in our study.
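The survey tabulation above can be sketched in a few lines: patients rate their likelihood of recommending the practice on a 0–10 scale, and the report states the share who walked away "highly positive." Note that the 9-and-above cutoff below is an assumption for illustration (the report does not publish its exact threshold), and the sample scores are made up, not survey data:

```python
def percent_highly_positive(scores, threshold=9):
    """Share of respondents scoring at or above the threshold, as a percentage.

    The threshold of 9 on the 0-10 recommendation scale is an assumed
    cutoff for "highly positive"; the actual report's cutoff is not stated.
    """
    positive = sum(1 for s in scores if s >= threshold)
    return 100.0 * positive / len(scores)

# Illustrative (fabricated) recommendation scores from ten respondents
sample = [10, 9, 8, 10, 7, 9, 10, 6, 9, 10]
print(percent_highly_positive(sample))  # 70.0
```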
Based on these observations, we concluded that review platforms that do not allow review solicitation may be projecting a deceptive picture of patient satisfaction to their audiences. Yelp, for instance, argues that asking for reviews biases ratings toward the positive, and cites a Northwestern University study in support of its "no review solicitation" policy, specifically the finding that solicited reviews tend to run about half a star higher than self-motivated reviews.
While the study's authors do acknowledge that solicited reviews tend to rate slightly higher than those of "self-motivated web reviewers," they concluded that this is because the latter group is biased. "Dissatisfied customers are more likely to be vocal about their dissatisfaction than satisfied customers about their satisfaction, and this phenomenon is expected to induce a selection bias where self-motivated reviewers are more likely to be dissatisfied. Hence, we expect to see a larger percentage of lower ratings coming from web reviews than email reviews," as stated in the study.
The study also highlights that a negative bias naturally develops over time when reviews are not solicited.
What's more, the study's conclusion encourages businesses to solicit reviews through email, as doing so leads to a larger, more representative, and more credible population of reviewers.
Which Review Sites Truly Reflect Patient Satisfaction?
Only review sites that represent both segments of reviewers, happy and unhappy, can provide a complete picture of a provider's ability to deliver great service. Since Yelp's review solicitation policy advises against asking your patients to write reviews for your services, it's doubtful that Yelp can present a complete picture of patient satisfaction.
Based on our experiences, we recommend focusing on these review sites as a healthcare provider, in order of importance:
- Google – 88% search for physicians directly on Google
- Facebook – Most popular social site and highly regarded by Google
- HealthGrades – Highly regarded by Google and featured in the side cards of a Google search
- RateMDs – Highly regarded by Google and featured in the side cards of a Google search
- Yelp – Ranks highly on Google, but competes directly with Google and appears only as a regular page in the search results. A very small percentage of patients search directly on Yelp for a provider.
Asking your patients to write reviews actually works in favor of prospective patients searching for a care provider. It gives your patients a voice to raise any issues, and ensures those issues are heard and resolved by you. As a result, your current patients receive a better experience, and prospective patients get more accurate information about your practice before selecting you as their care provider.