Trust and Safety in Gaming
- Trust and Safety in Gaming: A Complete Guide
- Define what you want people to do in your ecosystem
- Shape your culture proactively
- Find the right balance between automation and human involvement in content moderation
- Combine technology + humans for effective fraud prevention
- Ensure your outsourcer has the right agents for your content moderation or trust & safety team
- Be available whenever and wherever your users need help
- Work with your players to constantly evolve your process
- Trust and safety is easier with the right partner
Trust and Safety in Gaming: A Complete Guide
The long-term goal of every gaming studio is to create an enjoyable and inclusive environment for players. One which keeps your players engaged and coming back for more.
And an essential part of successfully doing this is to build a robust trust and safety program.
It’s how you do your best to ensure your players are protected from things like fraud, spam, offensive content, harassment, hate speech, and grooming.
In the online gaming industry, this has never been straightforward to begin with. But nowadays, it has become even more challenging due to:
- A larger pool of players with different cultures, languages, and backgrounds
- The prevalence of unstructured content, customization options, and user-generated content (UGC)
- Interactions between your players happening across multiple channels outside of your game’s ecosystem
Define what you want people to do in your ecosystem
“The first question I always ask is — what is your code of conduct?”
The first step towards building a trust and safety program for a gaming studio is to ask yourself: “What exactly do we want players to be doing in our ecosystem?”
For example, if your game is designed specifically for children, you wouldn’t want anything in there even bordering on a gray area.
However, if your game is more adult-oriented and deals with themes like violence, war, or politics, you’ll need to have a different set of standards for what’s permitted.
At the same time, you’ll also need to understand the different settings in which your players are interacting with each other. This allows you to pinpoint areas that could potentially become toxic and address them proactively.
This exercise enables you to expand on the various ways your game can be used, giving you the chance to better define how you expect players to behave and what action(s) you’ll take if they don’t.
This will form the foundation of your code of conduct.
Your code of conduct will help set early expectations for player behavior and a framework for moderation, which ultimately fosters a more positive environment for everyone involved.
Here are some best practices to keep in mind when creating your code of conduct:
- Use clear, specific language
- Establish a transparent appeal process
- Provide descriptive examples of positive and negative behavior
- Involve your community in the development and be receptive to feedback
- Decide how you’ll educate players about the code (tutorials, in-game prompts, etc.)
Free resource: Just starting out and building your code of conduct? Check out our customizable gaming code of conduct template! Make a copy and use as you please.
Shape your culture proactively
Defining your code of conduct is a great starting point for your trust and safety program.
But if you don’t follow up on it, it’s not worth much more than the paper it was printed on.
Here’s how Mike puts it: “Your culture is going to emerge, in some sense, on its own. So if you want it to be a certain kind of culture, you need to be proactive in shaping that.”
So for starters, your code of conduct needs to have company-wide buy-in. You want it to reflect your company culture and serve as the guiding light for all your decisions — from game design to customer support, and everything in-between.
For this, you need to have every department aligned on what you’re trying to achieve and how you want to achieve it.
If you use a business process outsourcing (BPO) partner to grow and scale your trust and safety team, they also need to share your culture. When interviewing BPOs, ask them about their core values and how they view the role of trust & safety in gaming.
Just like with your own team, be equally transparent with your players about the kind of experience you’re trying to foster.
Most toxicity in gaming isn’t caused by someone with bad intent. Players aren’t intentionally creating problems and using your game to spread chaos and negativity.
Toxicity is usually caused by a very small minority. In fact, less than 3% of users are responsible for 30% to 60% of toxic content across all types of platforms.
When toxicity occurs, it’s usually because players don’t understand what’s expected of them, whether that’s due to their early experiences in the game or simply a clash of cultures.
For example, let’s say you’re a kid playing a game for the first time and you’re matched with a rowdy crowd where players are using toxic language to trash talk one another.
There’s a good chance that you’ll presume that’s what is expected of you and you’ll start to behave in the exact same way.
But as Mike summarizes in another interview, “There are extremely few people in the world who are the villain of their own story.”
Players aren’t usually intending to join your game and wreak havoc on the community, which is why player education is so important. And if done right, your players will respond to it. This statistic from Apex Legends illustrates just that — 85% of players who received feedback changed their behavior without the use of bans.
Educating your players about your code of conduct can take many different forms. It could be through:
- Forums and wikis
- Email, social media, and video campaigns
- In-game education like loading screen tips or pop-ups
- Community challenges or reward systems that promote positive behavior
Rec Room, for example, educates its players through interactive tutorials and in-game signage, in addition to the code of conduct being easily accessible on their website and YouTube.
Make sure your content moderation agents – whether outsourced or in-house – have all of these resources, as well as macros and templates for explaining the code of conduct to players. The trainers on your outsourced support team can create quizzes to ensure agents can recite your code of conduct in their sleep.
You’ll need to experiment with different touchpoints and find the most effective combination of methods for your audience.
Once you’ve done this, you’ll need to demonstrate that you’re taking consistent action to realize your vision. This can be through:
- Content moderation and consistent enforcement of rules
- Publicly talking about good versus bad behavior
- Your own corporate actions
You want to do everything in your power to reinforce that trust and safety are real and important. An ideal end goal would be having it so deeply ingrained that players start to moderate one another and encourage appropriate behavior without your prompting.
Find the right balance between automation and human involvement in content moderation
As we touched upon earlier, it’s extremely important to consistently enforce your code of conduct to demonstrate you’re taking your commitment to trust and safety seriously.
But this can be hard, because your players won’t always report issues — even in more severe cases.
Plus, the sheer scale of data you’ll be dealing with makes manual review impossible, no matter how many pages of documentation you write or how well your team is trained.
This is where technology and automation come in. Tools with artificial intelligence and machine learning capabilities can help you review content at scale.
These could be tools that analyze patterns in user behavior, transactions, and previous account activity to detect different things like referral fraud, payment fraud, account takeovers, and in-game cheating.
Or ones that analyze content and community interactions to detect harassment, toxicity, and inappropriate content.
The right tools can bring tremendous results.
But you shouldn’t expect automated systems to do everything for you.
A lot of cases you deal with won’t be black and white. You’ll still need human moderation in cases where understanding context and nuance is important.
It also isn’t unusual for automated systems to generate false positives. Here again, you’ll need human moderators to validate the flags, and sometimes to take action to make things right.
For example, one of Peak Support’s clients has a policy against usernames that are inappropriate or sound like racial slurs. Sometimes, users get flagged for usernames that sound like racial slurs in English, but are actually real names in other languages.
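To make this flag-then-review flow concrete, here’s a minimal sketch in Python. The blocklist, scoring logic, and both thresholds are illustrative assumptions, not any particular vendor’s API; a production system would use a trained classifier rather than word matching.

```python
# Minimal sketch of automated flagging with a human-review queue.
# FLAGGED_TERMS and the thresholds are illustrative placeholders.
FLAGGED_TERMS = {"banned_word", "slur_example"}

def score_toxicity(text: str) -> float:
    """Toy scorer: the fraction of words that appear on a blocklist.
    A real system would use an ML model that returns a confidence."""
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(w in FLAGGED_TERMS for w in words) / len(words)

def route(text: str, auto_threshold: float = 0.8, review_threshold: float = 0.3) -> str:
    """Act automatically only on clear-cut cases; send ambiguous
    ones (like usernames that may be real names) to a human."""
    score = score_toxicity(text)
    if score >= auto_threshold:
        return "auto_remove"     # high confidence: machine acts alone
    if score >= review_threshold:
        return "human_review"    # gray area: queue for a moderator
    return "allow"
```

The key design point is the middle band: anything between the two thresholds goes to a person, which is exactly where false positives like cross-language name collisions get caught.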
Need help with content moderation without having to recruit full-time agents?
Peak Support has professionals with experience in the gaming industry to hit the ground running for you from day one.
Combine technology + humans for effective fraud prevention
We’ve talked a lot about moderation, but fraud prevention is another critical piece of building a safe environment.
Gaming companies face constant challenges from fraudulent actors, including:
- Account takeover: Hackers gain access to player accounts to steal in-game items or make unauthorized purchases.
- Phishing scams: Fraudsters send deceptive emails or messages pretending to be from the gaming company to trick players into revealing sensitive information like login credentials or credit card details.
- Chargeback fraud: Individuals misuse stolen credit card information to make in-game purchases and then initiate a chargeback with their bank, leaving the gaming company to absorb the loss.
- Bonus abuse: Players exploit promotional offers or bonus systems by creating multiple accounts, sharing promo codes on social media groups, or using manipulative tactics to gain excessive benefits without legitimate gameplay.
- Fake online marketplaces: Fraudulent websites that claim to sell in-game items at discounted prices but never deliver the goods after receiving payment.
Combating fraud at scale is incredibly challenging, which is why using the right technology is key.
Identifying and preventing these tactics manually would take a herculean effort and a huge team, which isn’t financially feasible. Instead, gaming clients use technology to identify account takeovers and suspicious logins, flag repeat offenders, and surface potential problems ahead of time.
Modern fraud prevention technology can analyze player behavior, transaction histories, and actions that deviate from a player’s normal activity — such as sudden spikes in purchasing or logging in from multiple locations.
Then a team of experienced agents reviews the flagged transactions to determine whether fraud has actually occurred.
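As a rough illustration of the “deviation from normal activity” signal described above, here’s a sketch in Python. The three-sigma cutoff and the shape of the data are assumptions made for the example; real systems combine many more signals, such as device, location, and login velocity.

```python
# Sketch of a single fraud signal: flag a purchase for human review
# when it deviates sharply from the player's own spending baseline.
# The 3-sigma cutoff is an illustrative assumption.
from statistics import mean, stdev

def is_suspicious(purchase_history: list[float], new_amount: float, k: float = 3.0) -> bool:
    """Return True if new_amount is more than k standard deviations
    away from the player's historical average spend."""
    if len(purchase_history) < 2:
        return True  # no reliable baseline yet: err toward review
    mu, sigma = mean(purchase_history), stdev(purchase_history)
    if sigma == 0:
        return new_amount != mu  # flat history: any change stands out
    return abs(new_amount - mu) > k * sigma
```

A flag here doesn’t mean fraud occurred; it only opens a case for an agent, which is where the human judgment described next comes in.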
Each of these signals creates an opportunity for human agents to investigate the situation further. Human agents are better at detecting nuance and using their critical thinking skills to make informed decisions. They’re also far better at showing empathy if (or when) something goes wrong.
By combining smart technology and capable, experienced humans, you’re able to build a much safer platform for your players.
A number of fraud prevention tools on the market can be a good fit for gaming platforms.
Some gaming companies also choose to build fraud prevention tools in-house.
For example, when Valve implemented its deep learning anti-cheat system (VACnet) for the game CS:GO, conviction rates for blatant cheating increased to 80-95%, compared to 15-30% for human-submitted cases.
Ensure your outsourcer has the right agents for your content moderation or trust & safety team
When choosing a business process outsourcer (BPO) for your trust & safety team, ask them how they hire great agents. Anti-fraud agents typically have different skills than content moderators.
Some of the characteristics to look for in fraud prevention agents include:
- Attention to detail
- An inclination for detective work and problem solving
- Excellent soft skills, as these are sensitive issues to manage with players, especially players who have been flagged incorrectly
Content moderators also need to have excellent soft skills. Content moderation cases are highly sensitive, and tend to have low CSAT, since most players touched by this team are being punished or even banned for bad behavior.
Agents with excellent communication skills can help smooth the process, particularly for the false positives. With those players, it’s important to smooth things over, apologize, and get them playing again as soon as possible.
Content moderators also need a high level of efficiency and focus – they may be processing 40 or more cases per hour, compared with 8 to 12 for a typical customer service agent.
Finally, consider the emotional burden of content moderation work, particularly if agents will be dealing with potentially racist, sexist, or even violent content. Make sure agents are aware of what they’re facing and understand the risks.
If you’re using a BPO, make sure the outsourcer also offers the necessary support to protect their agents’ mental health. Full health benefits, paid time off, counseling options, and necessary mental breaks are all important components of a well-managed content moderation team.
Be available whenever and wherever your users need help
Games have always been social. They give people an opportunity to communicate with others and build long-lasting emotional bonds.
And over the past decade, they’ve only become more social.
Games have transformed from curated experiences to venues where people from around the globe gather. And things like voice chat have only facilitated even more room for live interaction.
These interactions are not just limited to your game but extend to communities on other platforms, like Discord and Reddit.
While this is great for you as a gaming studio, it also means you need to be on top of things that extend beyond your gaming ecosystem. Because anything which has the potential to create bonds also has the potential to be misused by bad actors.
And as Mike says: “A good community process doesn’t put the burden on the victims to fix things.”
So what can you do about it? Here are some ways to make your customer support more accessible:
- Develop a comprehensive knowledge base that addresses the common issues
- Implement in-game support features for players to seek help without leaving the game
- Employ an omnichannel structure, handling channels like live chat, email, social media, community forums, phone support, and web-based ticketing systems
- Encourage peer-to-peer support by rewarding helpful community members
- Provide localized support and implement a follow-the-sun model if you’re dealing with a global audience
- Gather feedback and analyze your support tickets to update your support process and game design
While the idea itself is straightforward, implementation is anything but, because it requires significant time, resources, and effort.
This is an area where outsourcing your customer support can work wonders. It’s a flexible and scalable way for you to be there for your players without having to take away focus from building games.
A good outsourcing partner won’t just serve as the frontline of your support but will analyze the data they deal with to help you plug gaps and make improvements in your processes.
Work with your players to constantly evolve your process
If you think you can just draft a code of conduct and expect miracles, you’re in for a rude shock.
“There are some big misconceptions when it comes to trust and safety. The first and maybe broadest is the belief that there’s this one weird trick that will get you done with this, and then you can just move on with your life.”
Trust and safety in gaming is an ongoing process — just like your user acquisition strategy, building your software architecture, or any other part of your business.
Approaching trust and safety as a ‘one and done’ strategy will likely end up hurting you even more in the long-run.
For example, let’s say you notice a lot of toxicity coming from your chat. And you try to address this by making in-game chats harder to participate in.
What happens next?
Your players are just going to find a different place to chat — probably one with even less regulation. But when things go south, you and your community will still be held accountable and forced to deal with the backlash, even though you tried to wash your hands of it.
So instead of shying away from it, you should view trust and safety as an opportunity to shape the vision you have for your game.
And this goes for all games.
Mike talks about an A/B test he worked on with Call of Duty to measure the impact of content moderation. Within just a month, the moderated space had a 30% higher retention rate as compared to the undermoderated one.
That is a staggering number for a mature-rated, highly competitive game with an established and loyal fan base.
Which just goes to show that no matter your game, your audience, or how long you’ve been in business — if you implement changes while keeping your community as the focus, they will be well received.
“This is not a cost center. It’s a part of your customer acquisition strategy. The studios that understand that better are ultimately seeing more and more success these days.”
Keeping your community as the focus means empowering your players, giving them a chance to voice opinions, and taking action to show that they’re being heard.
Other ways to involve your community in shaping your ecosystem could be through:
- Surveys
- Feedback sessions
- Public beta testing of new policies
- Recognizing and rewarding community leaders
- Sharing reports on moderation statistics (bans, improvements made)
Trust and safety is easier with the right partner
So, there you go.
These are Mike’s (and our) top tips for you to curb toxicity, prevent fraud, and create an inclusive & positive environment for your players.
While it may seem a little overwhelming, the rewards are well worth the effort. The impact of investing in trust and safety early will only compound as time goes on. So you want to get on board as soon as possible!
If you need help implementing a robust trust and safety program for your business, look for a team who:
- Understands the unique challenges of the gaming industry
- Provides multilingual support across multiple different channels
- Can build or optimize your tech stack to deepen connections with players
- Develops a flexible, scalable, and sustainable way to curb fraud and toxicity
Last but not least, a huge thank you to Mike Pappas (you can follow him on LinkedIn here) for sharing his invaluable insights for this article. We’ll leave you with one final quote from him:
“You’re trying to do this complicated thing that has a lot of different moving parts. How do you get started? The most obvious answer is — you hire an expert.”
Ready to Secure Trust and Safety?
Stay ahead with strategies to build and maintain trust and safety in gaming!
Ready to chat with us?
We’ve helped dozens of innovative companies launch and scale their customer service teams. Whatever you need to grow your business, our flexible offerings can fit. Let’s chat about how outsourcing can unlock new levels of growth for your business.