Google published research that studied the effects of user generated content (UGC) spam on user experience. The research uncovered interesting insights on user experience and points the way to further research.
An author of the research, Dr. Sowmya Karunakaran, is a research lead on Google's Trust and Safety team. The Trust and Safety team tackles bad actors across Google's products, including YouTube, Search, Maps, Gmail, and Google Ads, with a focus on malware, spam, and account hijacking.
They created a scale for measuring user experience called the HaBuT scale. HaBuT stands for Happiness, Burden and Trust.
They discovered an attribute they called Burden, which encompasses experiences that are offensive, annoying, irrelevant, or inappropriate, as well as general spamminess.
The offensive and abusive UGC spam discussed below was researched within the context of reviews on Google Play. In that context, offensive or abusive UGC spam was content resembling: “Idiotic morons!”
This research focused on UGC spam on a Google Play-style review page. The methodology was to show 100 reviews to each group of research subjects, a certain percentage of which were a specific kind of spam.
The study was conducted with 3,300 participants from three countries: India, South Korea, and the USA, with 1,100 research participants from each country.
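The feed-construction step described above can be sketched in a few lines. This is an illustrative reconstruction, not the paper's actual code; the function name, data, and seed are assumptions made for the example.

```python
import random

# Hypothetical sketch of the study's setup: each participant sees 100
# reviews, a fixed percentage of which are one type of spam.
def build_feed(genuine_reviews, spam_reviews, spam_rate, seed=None):
    """Mix genuine reviews with spam at the given rate (e.g. 0.10 = 10%)."""
    rng = random.Random(seed)
    n_spam = round(100 * spam_rate)
    feed = rng.sample(genuine_reviews, 100 - n_spam) + rng.sample(spam_reviews, n_spam)
    rng.shuffle(feed)  # spam is interleaved with genuine reviews
    return feed

genuine = [f"genuine review {i}" for i in range(200)]
spam = [f"idiotic dirty morons {i}" for i in range(50)]  # abusive-language spam
feed = build_feed(genuine, spam, spam_rate=0.10, seed=42)
print(len(feed), sum(1 for r in feed if "morons" in r))  # 100 reviews, 10 of them spam
```

Varying `spam_rate` per experimental group is what lets the researchers compare, say, a 5% condition against a 10% condition.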
The goal was to identify which kind of UGC spam negatively affected the user experience the most. That way, a publisher could allocate resources to fighting the kind of UGC spam that created the most disturbance to the user experience.
UGC Spam and User Experience
Past research has focused on identifying UGC spam. This research is different in that it attempts to identify the kind of spam that degrades the user experience the most.
The intuition is that if one can identify what affects users the most, a publisher can put more resources into neutralizing that kind of spam in order to improve the site experience.
This article examines what the research discovered and how it is relevant to you.
According to the research:
“The effort required to reduce the spam rates from 5% to 0% is much higher as compared to bringing it down from 10% to 5%.
…There is a business need for prioritization based on which spam types impact user experience negatively and invest time and resources towards building automated systems to tackle those.”
The research paper suggests that some kinds of spam can be safely ignored because their impact on the user experience is practically negligible. But as you will see later on, some kinds of spam have a strong negative impact.
What Kind of UGC Spam was Researched?
The researchers studied five kinds of UGC spam.
According to the research paper:
“• Gibberish: e.g. asdsad jksjfs sdhd
• Irrelevant: e.g. Review of a movie for a gaming app
• Solicitation: e.g. Follow me on twitter @xxxx
• Abusive language: e.g. idiotic dirty morons
• Promotions: e.g. Instant cash discount, register now”
Results of the Study
Here are the results:
“…we notice that abusive words… had the most impact on user burden. Burden increased steeply as rates of abusive words spam increased.”
The researchers’ use of the word “burden” refers to one of the three attributes they were measuring (happiness, trust, and burden).
The burden attribute corresponds to these statements:
“I found some of the content offensive
I found some of the content annoying
I found some of the content irrelevant to my task
I found some of the content inappropriate”
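A common way to turn survey statements like these into a single number is to average the Likert ratings across the items. This is a hedged sketch under that assumption, not the paper's actual scoring procedure; the item wording is from the quote above, but the 1–5 scale and averaging are illustrative.

```python
# The four burden statements from the study, scored on an assumed
# 1-5 Likert scale (1 = strongly disagree, 5 = strongly agree).
BURDEN_ITEMS = [
    "I found some of the content offensive",
    "I found some of the content annoying",
    "I found some of the content irrelevant to my task",
    "I found some of the content inappropriate",
]

def burden_score(ratings):
    """Average a participant's ratings across the four burden items."""
    return sum(ratings[item] for item in BURDEN_ITEMS) / len(BURDEN_ITEMS)

# One hypothetical participant's responses:
responses = {
    "I found some of the content offensive": 4,
    "I found some of the content annoying": 5,
    "I found some of the content irrelevant to my task": 3,
    "I found some of the content inappropriate": 4,
}
print(burden_score(responses))  # 4.0
```

Averaging per-participant scores across each experimental condition is what lets burden be compared as the spam rate changes.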
The UGC spam with the second most impact was promotional spam.
“Promotions spam was a close second leading to significant decline in burden across all three countries.”
Gibberish spam had the lowest impact on user burden.
“Gibberish spam had the lowest impact on user burden and did not change significantly with increase in rate of spam.”
UGC Spam and User Trust
Another interesting data point was, within the context of Google Play reviews, that the UGC spam didn’t impact user trust:
“…we observe that none of the spam types had an impact on Trust and the only exception being Gibberish spam in South Korea.”
One would think that seeing UGC spam on a page would cause users to trust the page less. Why would users trust a page that contains UGC spam?
Could it be that users are able to differentiate the main content from the UGC and thus compartmentalize their trust in each kind of content? In that scenario, a user would trust the main content and gloss over the UGC because the UGC is no longer trusted.
Obviously, UGC spam should be minimized. But in this scenario, Google’s research team discovered that the user experience is improved most by focusing resources on UGC spam featuring abusive language.
This leads to the second insight: there is value in identifying which kinds of UGC are having the most negative effect on user experience. Once identified, the business can apply resources in the most efficient manner, resulting in happier users.
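The prioritization logic described above can be sketched as ranking spam types by how steeply measured burden rises as each type's rate increases. The slope values below are made up for illustration; only their ordering reflects the study's findings (abusive first, promotions a close second, gibberish last).

```python
# Illustrative only: hypothetical change in mean burden score per
# +1 percentage point of spam, by spam type. Values are not from the paper.
burden_slope = {
    "abusive language": 0.30,  # study: burden increased most steeply
    "promotions": 0.25,        # a close second
    "solicitation": 0.10,
    "irrelevant": 0.08,
    "gibberish": 0.01,         # lowest impact, roughly flat
}

# Rank spam types from most to least harmful to user experience.
priority = sorted(burden_slope, key=burden_slope.get, reverse=True)
print(priority[0])  # abusive language -> tackle this spam type first
```

With a ranking like this, anti-spam engineering effort can be allocated where it buys the largest improvement in user experience.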
A curious observation was that, within the context of Google’s research, spam didn’t affect user trust. It’s a surprising data point to consider, which may or may not hold true for your situation.
In the video that accompanies the research, Dr. Karunakaran states that one way she discovered to build trust in Google Play is to show the top positive review and the top negative review. Doing so, she states, gives the user a sense of a product’s pros and cons.
The video also shared that users tend to be suspicious if all the reviews are five-star. The Googlers insisted that showing a spectrum of user reviews is a sign of a healthy review ecosystem.
The goal of the HaBuT method is to identify what needs to be fixed. The insight is that the amount of negative UGC is not necessarily the criterion to use when deciding which kind to go after; it is more useful to research which kind of negative activity impacts user experience the most. The problem of UGC spam then becomes an issue of improving user experience rather than going after negativity for its own sake.
Additionally, the video recommends user surveys to measure happiness levels after the solution has been applied. This kind of data can demonstrate to stakeholders whether or not the initiative was successful.
Read the research paper:
Spam in User Generated Content Platforms: Developing the HaBuT Instrument to Measure User Experience
Download the PDF here:
Watch Dr. Sowmya Karunakaran’s presentation here: