Google has published an explainer that provides insights into how Google handles reviews left for local businesses on Google Maps. The article describes the various steps and actions that Google takes that enable it to review and publish user-generated comments in a matter of seconds.
Google has shared the five steps it takes to ensure that Google Maps reviews are helpful and authentic.
Step 1: Strict content policies
The backbone of Google’s approach to moderating comments left on Google Maps is a well-defined content policy.
Every website that accepts user-generated content should have a well-defined policy that describes what is acceptable. This helps users understand limits and also informs moderators when to intervene.
“We have strict content policies in place to ensure that reviews are based on real-world experiences and to keep inappropriate and offensive comments out of Google Business Profiles.”
Key points about the Google Maps review content policy
Google’s content policy defines the result they’re trying to encourage:
“Contributions should be based on real experiences and information.”
The Google Content Policy defines six types of prohibited activities.
Examples of review content that violates the map review policy:
- Intentionally fake content
- Copied or stolen photos
- Off-topic reviews
- Defamatory language
- Personal attacks
- Unnecessary or incorrect content
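To make the idea concrete, here is a minimal sketch of how a written content policy might be encoded as machine-checkable rules. The category names mirror the list above, but the phrase lists and matching logic are invented for illustration and are not Google's actual implementation.

```python
# Hypothetical sketch: a content policy encoded as simple keyword rules.
# Real systems use trained classifiers; this only shows the policy-to-check
# mapping described in the article.

PROHIBITED_PHRASES = {
    "defamatory_language": ["scammer", "criminal"],  # illustrative terms only
    "personal_attack": ["idiot", "moron"],
}

def find_violations(review_text: str) -> list[str]:
    """Return the policy categories a review appears to violate."""
    text = review_text.lower()
    violations = []
    for category, phrases in PROHIBITED_PHRASES.items():
        if any(phrase in text for phrase in phrases):
            violations.append(category)
    return violations

print(find_violations("The owner is a scammer and an idiot."))
# -> ['defamatory_language', 'personal_attack']
```

A rule table like this is easy for human moderators to read and update, which is one reason written policies translate well into both training material and automated checks.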
Step 2: The content policy is integrated into Google's algorithms
Google’s next step in protecting the integrity of Google Maps reviews is to integrate the content policy into its algorithms, using the policy as training data for the machine learning systems and as training material for human moderators.
“Once a policy is written, it is turned into training materials — both for operators and for our machine learning algorithms — to help our teams detect content that violates the policy and ultimately keep Google reviews helpful and authentic.”
Step 3: Google moderates reviews immediately
Google shares that every review is sent to its moderation systems as soon as it is posted.
Google uses a mixture of human and automated review systems. Google’s algorithms can process a review and pass it on for publication in a matter of seconds.
Google has traditionally preferred to scale its systems using algorithms rather than relying on humans to complete tasks.
The algorithm looks at many factors to determine if a review is fake.
Google names some of the review factors:
- Is the content offensive?
- Is the content off topic?
- Is an account leaving a review engaging in suspicious behavior?
- Is a sudden spike in reviews tied to news coverage or social media attention that could spur fake reviews?
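The factors above suggest a pipeline that combines several signals into one routing decision: publish, hold for a human, or reject. The sketch below is a hypothetical illustration of that pattern; the signal names, weights, and thresholds are all invented and do not describe Google's system.

```python
# Hypothetical sketch: combining moderation signals into a routing decision,
# loosely following the factors the article lists. All thresholds are
# illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ReviewSignals:
    offensive_score: float    # 0.0-1.0, e.g. from a text classifier
    off_topic_score: float    # 0.0-1.0, e.g. from a relevance model
    account_suspicion: float  # 0.0-1.0, from account-behavior checks
    spike_anomaly: float      # 0.0-1.0, from review-volume monitoring

def moderation_decision(s: ReviewSignals) -> str:
    """Route a review based on its riskiest signal."""
    risk = max(s.offensive_score, s.off_topic_score,
               s.account_suspicion, s.spike_anomaly)
    if risk >= 0.9:
        return "reject"
    if risk >= 0.5:
        return "human_review"  # where human "nuanced understanding" comes in
    return "publish"

print(moderation_decision(ReviewSignals(0.1, 0.2, 0.1, 0.1)))  # -> publish
```

Routing only borderline cases to humans is what lets a system like this clear most reviews "in a matter of seconds" while still applying human judgment where it matters.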
Google shares how its automated system works:
“Once someone posts a review, we send it to our moderation system to ensure that the review doesn’t violate any of our policies.
… Given the volume of reviews we receive regularly, we’ve found that we need the nuanced understanding that humans provide and the volume that machines provide to help us curate contributed content.”
Step 4: Google encourages community moderation
Google has stated that it encourages companies and the public to submit reports of fake reviews.
This is a standard way to moderate user-generated content (UGC).
This method is sometimes called Report-a-Post. It makes users feel part of a community and crowdsources the moderation function, letting users and businesses apply their unique perspectives to catch bad reviews that might slip past a moderator or an algorithm.
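A Report-a-Post flow can be sketched as a counter of distinct reporters per review, with a threshold that queues the review for re-moderation. The threshold and names below are invented for illustration, not taken from Google.

```python
# Hypothetical sketch of a Report-a-Post flow: user reports accumulate
# against a review, and crossing a threshold flags it for re-moderation.

from collections import defaultdict

REPORT_THRESHOLD = 3  # illustrative value
reports: dict[str, set[str]] = defaultdict(set)  # review_id -> reporter ids

def report_review(review_id: str, reporter_id: str) -> bool:
    """Record a report; return True once the review needs re-moderation."""
    # A set de-duplicates repeat reports from the same user, so one
    # person cannot flood a review they dislike.
    reports[review_id].add(reporter_id)
    return len(reports[review_id]) >= REPORT_THRESHOLD
```

Requiring several distinct reporters is a common defense against the obvious abuse of the feature: a competitor repeatedly flagging a legitimate positive review.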
Step 5: Google proactively anticipates fake reviews
An interesting detail shared by Google is that it proactively anticipates events that may lead to abusive reviews. Google closely monitors reviews of businesses in areas tied to those events to ensure that only reliable and useful reviews are published.
“For example, when there is an upcoming event that has a large following – such as an election – we apply high protections to places associated with the event and other nearby businesses that people might look for on Maps.”
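One way to picture this event-based protection is a lookup that applies a stricter moderation threshold to places associated with a high-profile event for its duration. The event names, place IDs, dates, and thresholds below are all hypothetical.

```python
# Hypothetical sketch of event-based protection: places linked to a
# high-profile event get a stricter moderation threshold while it runs.

from datetime import date

PROTECTED_EVENTS = {
    # event name -> (start, end, affected place ids) -- illustrative data
    "election_2024": (date(2024, 11, 1), date(2024, 11, 15),
                      {"polling_place_17", "city_hall"}),
}

def moderation_threshold(place_id: str, today: date) -> float:
    """Return the risk threshold for auto-publishing (lower = stricter)."""
    for start, end, places in PROTECTED_EVENTS.values():
        if place_id in places and start <= today <= end:
            return 0.5  # heightened protection during the event window
    return 0.9  # default threshold
```

The key design point the article highlights is that the protection is applied in advance, before any spike in abusive reviews actually materializes.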
Machine learning plus human oversight of Google Maps reviews
Google’s approach to moderating user-generated content follows a long-standing approach pioneered in forums and blogs, including using automated systems to deal with users and events that can lead to a higher chance of offensive content.
This article is useful because Google's steps can serve as inspiration and a model for crafting a moderation approach for any website or platform that accepts UGC.
Not curating user-generated content on your site can result in penalties as well as a poor user experience. For Google, protecting reviews from spam is about meeting users' expectation of trust and providing a better user experience. If Google Maps reviews fill up with spam, no one wins: users lose faith in the reviews, and that hurts businesses that rely on Google Maps.