Google published an explainer that offers insights into how it handles reviews left for local businesses on Google Maps. The article outlines the multiple steps Google takes that enable it to review and publish user-generated reviews in a matter of seconds.
Google shared five steps it takes to ensure that Google Maps reviews are useful and authentic.
Step 1: Strict Content Policies
The backbone of Google’s approach to moderating reviews left on Google Maps is a well-defined content policy.
Every website that accepts user-generated content must have a well-defined policy describing what is acceptable. This helps users understand the limits and also tells moderators when to step in.
“We’ve created strict content policies to make sure reviews are based on real-world experiences and to keep irrelevant and offensive comments off of Google Business Profiles.”
Key points about Google Maps Review Content Policy
Google’s content policy states the outcome it is trying to encourage:
“Contributions must be based on real experiences and information.”
Google’s content policy outlines six kinds of activity that are prohibited.
Examples of Review Content That Violates the Maps Review Policy:
- Deliberately fake content
- Copied or stolen photos
- Off-topic reviews
- Defamatory language
- Personal attacks
- Unnecessary or incorrect content
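A site implementing a similar policy might encode these categories as a first-pass automated screen before anything reaches a human. The sketch below is purely illustrative: the category names, keyword lists, and the keyword-matching approach are assumptions for demonstration, not how Google actually enforces its policy.

```python
# Illustrative first-pass policy screen mirroring the categories above.
# Keyword lists are hypothetical placeholders, not Google's actual rules.
PROHIBITED_CATEGORIES = {
    "off_topic": {"my landlord", "politics"},   # hypothetical off-topic markers
    "personal_attack": {"idiot", "moron"},      # hypothetical attack terms
}

def screen_review(text: str) -> list[str]:
    """Return the policy categories a review appears to violate."""
    lowered = text.lower()
    flags = []
    for category, keywords in PROHIBITED_CATEGORIES.items():
        if any(kw in lowered for kw in keywords):
            flags.append(category)
    return flags
```

In practice a real system would use trained classifiers rather than keyword lists, but the structure is the same: each policy category becomes a concrete, testable check.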
Step 2: Content Policy Is Integrated Into Google’s Algorithm
The next step Google takes to protect the integrity of Google Maps reviews is integrating the content policy into its algorithms, using the policy as training material for both the algorithms and its human moderators.
“Once a policy is written, it’s turned into training material — both for our operators and machine learning algorithms — to help our teams catch policy-violating content and ultimately keep Google reviews helpful and authentic.”
Step 3: Reviews Are Immediately Moderated by Google
Google shares that all reviews are sent to its moderation systems as soon as they are posted.
Google uses a mix of human and machine review systems. Google’s algorithms can process a review and give it a pass for publication within a matter of seconds.
Google has traditionally preferred to scale its systems with algorithms rather than depend on humans to complete tasks.
The algorithm looks at many factors to determine if a review is fake.
Google names a few of the review factors:
- Is the content offensive?
- Is the content off-topic?
- Does the account leaving the review show suspicious behavior?
- Is there a spike in reviews tied to news or social media attention that could motivate fake reviews?
Google shares how its automated system works:
“As soon as someone posts a review, we send it to our moderation system to make sure the review doesn’t violate any of our policies.
…Given the volume of reviews we regularly receive, we’ve found that we need both the nuanced understanding that humans offer and the scale that machines provide to help us moderate contributed content.”
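The factors Google names suggest a routing decision: clear violations are rejected automatically, ambiguous cases go to human moderators, and everything else publishes within seconds. The sketch below is an assumption about how such signals could be combined; the signal names and routing rules are hypothetical, not Google's actual logic.

```python
# Hypothetical routing of a review based on the signals the article lists.
from dataclasses import dataclass

@dataclass
class ReviewSignals:
    is_offensive: bool        # offensive content detected
    is_off_topic: bool        # review not about the business
    account_suspicious: bool  # posting account shows suspicious behavior
    spike_in_reviews: bool    # unusual review volume for this place

def moderate(signals: ReviewSignals) -> str:
    """Route a review: publish, reject, or escalate to a human moderator."""
    if signals.is_offensive or signals.is_off_topic:
        return "reject"        # clear policy violations are handled by machines
    if signals.account_suspicious or signals.spike_in_reviews:
        return "human_review"  # nuanced cases get the human judgment Google cites
    return "publish"           # passes automated checks in seconds
```

This split matches the article's point: machines provide scale for the clear-cut cases, while humans supply nuanced understanding for the rest.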
Step 4: Google Encourages Community Moderation
Google stated that it encourages businesses and the public to submit reports of fake reviews.
This is a standard method of moderating user-generated content (UGC).
This approach is sometimes called Report-a-Post. It makes users feel part of a community and crowd-sources the review function, allowing users and businesses to apply their unique viewpoints to catch bad reviews that might slip past a moderator or an algorithm.
Step 5: Google Is Proactive and Anticipates Fake Reviews
An interesting detail Google shared is that it proactively anticipates events that could lead to abusive reviews. It applies heightened monitoring to reviews of businesses near those events to make sure that only authentic and useful reviews are published.
“For instance, when there’s an upcoming event with a significant following — such as an election — we implement elevated protections to the places associated with the event and other nearby businesses that people might look for on Maps.”
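One simple, generic way a platform could trigger this kind of heightened monitoring is to compare a place's current review volume against its historical baseline. The detector below is a minimal sketch under that assumption; the threshold factor and the baseline method are arbitrary illustrative choices, not anything Google has described.

```python
# Hypothetical spike detector: flag a place for heightened monitoring when
# today's review count far exceeds its historical daily average.
from statistics import mean

def needs_heightened_monitoring(daily_counts: list[int], today: int,
                                factor: float = 3.0) -> bool:
    """Flag when today's review volume exceeds `factor` times the mean."""
    baseline = mean(daily_counts) if daily_counts else 0.0
    # max(1.0, ...) avoids flagging low-traffic places on tiny absolute changes
    return today > max(1.0, baseline) * factor
```

A venue that normally receives a couple of reviews per day and suddenly receives dozens would be flagged, which is the pattern an election or viral news event tends to produce.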
Machine Learning Plus Human Moderation of Google Maps Reviews
Google’s approach to moderating user-generated content follows a longstanding approach pioneered on forums and blogs, including the use of automated systems to handle users and events that carry a greater chance of abusive content.
This article is useful because the steps Google takes can serve as inspiration and a template for formulating an approach to moderating user-generated content on any website or platform that accepts user content.
Failure to moderate user-generated content on your own site can result in penalties, and it degrades the user experience. For Google, protecting reviews against spam is about meeting user expectations of trust and providing a better experience. If Google Maps reviews were to fill with spam, nobody wins: users would lose trust in the reviews, and that would hurt the businesses that depend on Google Maps.