Online reviews are an emerging and increasingly important market phenomenon. Reviews play a large role in consumer decision-making, providing not only product information but, more importantly, reports of other consumers’ experiences with the product. While marketers have long studied strategies for communicating with consumers in the word-of-mouth literature, little is known about the peculiarities of online reviews:
What reviewers write depends on when they write it. The order of rating and writing a review can influence the reviewer’s own attitudes and how readers interpret the review. Reviewers rate products and consumers rate reviews. This feedback helps subsequent consumers focus on the “best” reviews. But what makes a good review?
Very little research has focused on the effects of review writing on the writers themselves. In order to address this need, we consider a number of questions related to review writing and attitudes; in particular, we explore how the format of the review task affects the process of review writing and readers’ comprehension of the review.
We start by assuming that a consumer has recently engaged in a consumption experience and holds some attitude toward the experience, however weakly formed. We consider two different scenarios, in which consumers write a text review either before or after reporting their attitude regarding the consumption experience. Our central argument is that these two scenarios generate fundamentally different motivations, which in turn dictate the reviewing process and downstream consequences. When the review task precedes any rating, reviewers are motivated to address the different views of unknown readers. This motivation leads to a more open-ended, unstructured review task that allows the reviewer to elaborate extensively on different aspects of the experience and his or her reactions towards them. Such an ‘explorative unfolding’ of information allows the consumer to bring to mind a wide variety of attitude-relevant information, resulting in attitude change.
A stark contrast exists when the reviewing task follows evaluation. In this case, reviewers who have already reported their attitude will feel committed to their rating and utilize their review to engage in “defensive bolstering”. However, individuals are generally poor at recognizing the reasons underlying their preferences, and this inability can lead reviewers to encounter difficulty generating sufficient support for their stated evaluations. In keeping with the principle of metacognitive inference, we suggest that this perceived difficulty may result in attitude moderation. Moreover, due to their one-sided, purposeful nature, text reviews written after a rating may not be very effective at conveying the authors’ attitude to readers.
In a series of lab experiments, hypotheses related to the two different cases above were examined. A pretest utilizing various short animated movie clips revealed consistent attitude differences between participants who wrote a text review of a clip and participants who simply evaluated the movie. These pretest results provided initial evidence that the mere act of writing a text review changes reviewers’ attitudes.
Study 1 explored the effect of task order on attitude change and actual review content. In the experiment, undergraduate participants (N=67) watched the target stimulus, a short animated movie clip, and were then given different instructions according to condition. Two factors were manipulated: task order and type of writing. Regarding the first factor, review-then-rate conditions completed the writing task and then rated the movie, while rate-then-review conditions completed the steps in the opposite order. Regarding the second factor, participants were asked to generate either a text review of the movie or a filler passage (summarizing the events of the preceding day). In a follow-up session three weeks later, participants were reminded of the movie clip and asked to recall their evaluation.
Replicating the pretest findings, analyses revealed that at t1, mean attitude toward the movie was considerably lower in the review-then-rate condition than in the other three conditions (which did not differ). At t2, this attitude remained essentially unchanged. For the rate-then-review condition, not only was the t2 attitude significantly lower than that of the control groups, but the absolute difference between t1 and t2 attitudes was significant as well. These findings support our argument that even after an initial assessment, attitude is moderated by the process of writing a text review.
In order to examine review content itself, the Linguistic Inquiry and Word Count (LIWC) tool was applied to analyze reviews written by the two text review conditions. Among other findings, analyses revealed that the review-first group wrote significantly more words than the rate-first group, indicating more elaboration on the movie experience. In addition, the ratio of positive-to-negative word use was closer to one for the review-first group than for the rate-first group, indicating that the writing of the former was more balanced. Finally, the review-first group made greater use of articles, suggesting a more objective writing style.
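The content measures above can be sketched in a few lines. The following is a minimal illustration only: LIWC's dictionaries are proprietary, so the tiny POSITIVE_WORDS, NEGATIVE_WORDS, and ARTICLES sets below are stand-in lexicons, not the actual LIWC categories, and the smoothing of the ratio is our own simplification.

```python
import re

# Stand-in lexicons (illustrative; LIWC uses much larger proprietary dictionaries)
POSITIVE_WORDS = {"good", "great", "fun", "enjoyable", "charming"}
NEGATIVE_WORDS = {"bad", "boring", "dull", "confusing", "slow"}
ARTICLES = {"a", "an", "the"}

def review_metrics(text: str) -> dict:
    """Compute word count, positive-to-negative ratio, and article rate."""
    tokens = re.findall(r"[a-z']+", text.lower())
    n_pos = sum(t in POSITIVE_WORDS for t in tokens)
    n_neg = sum(t in NEGATIVE_WORDS for t in tokens)
    return {
        "word_count": len(tokens),
        # A ratio near 1 indicates balanced valence; add-one smoothing
        # avoids division by zero on short reviews.
        "pos_neg_ratio": (n_pos + 1) / (n_neg + 1),
        "article_rate": sum(t in ARTICLES for t in tokens) / max(len(tokens), 1),
    }

m = review_metrics("The movie was fun but a bit slow")
# m["pos_neg_ratio"] == 1.0 here: one positive and one negative stand-in word
```

On this reading, "more balanced" writing corresponds to a pos_neg_ratio closer to 1, and a "more objective" style to a higher article_rate.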
Study 2 examined how differences in review content influence readers’ ability to infer the attitude of the reviewer. Among the reviews written by participants in Study 1, six each were selected from the review-first group and the rate-first group. Participants (N=68) were asked to read the reviews, estimate what ratings the authors assigned the movie clip, and report their confidence in these estimates.
Rating discrepancies were calculated as the absolute difference between a reader’s estimated rating and the author’s actual rating from Study 1. In support of the prediction, analyses indicated that rating discrepancies were lower for reviews written by the review-first group than for the rate-first group. Notably, participants were actually more confident in their estimates after reading reviews from the rate-first group, despite being less accurate. Given the text analysis results reported above, it is likely that the balanced reviews of the review-first group seemed more ambiguous to readers but actually conveyed the writers’ opinions more effectively.
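The discrepancy measure is a simple absolute difference, averaged within condition. A minimal sketch follows; the rating pairs in the usage comment are illustrative, not the Study 2 data.

```python
def rating_discrepancy(estimated: float, actual: float) -> float:
    """Absolute difference between a reader's estimate and the author's rating."""
    return abs(estimated - actual)

def mean_discrepancy(pairs) -> float:
    """Average discrepancy over (estimated, actual) rating pairs for a condition."""
    diffs = [rating_discrepancy(e, a) for e, a in pairs]
    return sum(diffs) / len(diffs)

# Illustrative: a reader guessing 4 for an author who rated 6 is off by 2;
# averaging over a condition's pairs gives that condition's discrepancy score.
score = mean_discrepancy([(4, 6), (5, 5)])  # (2 + 0) / 2 = 1.0
```

Lower mean discrepancy for the review-first condition is what indicates that those reviews conveyed the authors’ attitudes more faithfully.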
Overall, the results provide initial evidence that reviews written before vs. after global evaluation affect reviewers’ attitudes in systematic ways, differ in terms of structure and content, and create different interpretations among downstream readers. These findings bear important implications for those researching online reviews and the writing process, and also for marketing practitioners interested in utilizing consumer-created content.
The past decade has seen a dramatic increase in the use of consumer-generated content (CGC). Modern consumers are involved in creating, sharing, and reading such content, and these behaviors have received increased attention from marketing practitioners and scholars. In particular, there has been growing interest in product reviews and their effects on consumers’ purchase behavior.