On the last day of GamesBeat Summit last week, leaders from Roblox, Fair Play Alliance, and Accenture discussed the issue of digital civility, and what challenges need to be addressed as games lean more firmly into the mainstream.
“If there’s ever been a silver lining to COVID, it’s been gaming,” said moderator Seth Schuler, managing director with Accenture Strategy. “We estimate 35% of the world’s population now plays video games.”
Accenture’s most recent gaming research found that gamers are spending 16 hours a week playing games, and another 14 hours a week engaged with others via social media across platforms like YouTube, Baidu, Discord, and Twitch. Nearly all gamers report that they game online to hang out with their friends, meet new people, and, during COVID, have much-needed social experiences. As a category, gaming is increasingly becoming a super platform for engaging people across entertainment activities and growing an ever-more diverse set of players.
For Roblox, socialization and player-created content are the core of the platform, arguably more so than in traditional games, so trust has had to be central from the start, said Laura Higgins, the company’s director of community safety and digital civility.
“Safety has always been our number-one priority,” Higgins said. “We’re a platform that was built around young people, and so those values, for us, that’s table stakes. My role has been to focus on creating healthy communities, positive experiences, and educating the community to then go and be good citizens elsewhere online.”
The intent is to teach life skills around kindness, empathy, and teamwork, as well as conflict resolution, she added. That requires stringent rules and a multilayered approach to safety: a large team of moderators backed by AI and machine learning tools, plus chat filtering, particularly of personal information, to create a safe experience at the foundation. Roblox also includes a suite of parental control tools, locked behind a PIN, so parents can be confident their children are safe, and it encourages parents to actually spend time on the platform with their children.
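To make that kind of chat filtering concrete, here is a minimal, purely illustrative sketch in Python of a pattern-matching first pass over personal information; the patterns and function names are invented for this example and are not Roblox’s actual system, which layers machine learning and human moderation on top of filtering like this.

```python
import re

# Purely illustrative first-pass chat filter; the patterns below are
# invented for this example and are not Roblox's actual system.
PII_PATTERNS = [
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),           # phone-number-like strings
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),                  # email-address-like strings
    re.compile(r"\b\d+\s+\w+\s+(street|st|avenue|ave|road|rd)\b", re.I),  # street-address hints
]

def filter_chat(message: str) -> str:
    """Replace anything that looks like personal information with hash marks."""
    for pattern in PII_PATTERNS:
        message = pattern.sub("######", message)
    return message

print(filter_chat("call me at 555-123-4567 after school"))
# -> "call me at ###### after school"
```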
As its community grows up with it, from kids to older teenagers and young adults, Roblox is intent on fostering a nurturing experience. To do that, one of the most important goals should be listening, Higgins said.
“This is my advice for any developers out there,” she explained. “If you can spend time with your community, you’ll learn so much. I’ve been an online safety professional all my life. There are certain things that I can take for granted that we as a company need to do to keep our community safe. That’s the basic. We start there. But there are certain more nuanced things going on with the community. It’s important that we listen and adapt to what’s going on.”
Accenture found that with folks spending more time online now, reports of bad behavior are going up, but one person’s bullying could be seen as another person’s rough play, Schuler said.
“One of the biggest questions for us is not just how we make games great experiences and how we have fun together, but how we look at these spaces and fulfill social needs and understand the breadth of needs and opportunities in these spaces,” said Kimberly Voll, co-founder of the Fair Play Alliance.
Game communication is very different from face-to-face interaction, even over voice chat, because it lacks non-verbal cues, she said, which can lead to mismatched expectations in a gaming experience and to unexpected friction. The other challenge comes from game audiences increasingly crossing multiple cultures. Gamers don’t necessarily have the same shared background or the infrastructure of trust to rely on for successful interactions.
“A lot of the work we’re trying to do is to move us away from one size fits all, which is always the classic one size fits none,” she said. “These are spaces where humans gather, and where humans gather there is a full spectrum of behavior. Not everyone is going to get along with everyone else. What does that mean for how we’re making games?”
With some bad actors eager to enter the ecosystem, the challenge is building spaces that reduce the vulnerability of these communities as a whole, making them healthier, more robust, and able to push back against those bad actors, and, equally, fostering resilience within individuals and reducing their vulnerability as they create experiences.
“When we look at the root causes of why these behaviors emerge, when we know there’s a possibility of friction or mismatched expectations, we as game developers can invest in reducing the chance of that happening at the beginning, before a game gets off on the wrong foot, before it descends into frustration and folks start taking shots at each other,” said Voll. “In addition, [we must] take steps to understand how bad actors gain access to the system and operate within these systems, and do our best to reduce the chance.”
Individuals in a space can use social tools or opportunities to push back against harmful experiences, she added. Consequences are incredibly important; developers need to get better at detecting and assessing hate and harassment and at drawing strong lines in the sand, but the problem is much more complex. Developers also need to start investing in enriching spaces that foster successful interactions and coexistence, spaces that speak to people’s need to connect and feel a sense of belonging, wherever they come from.
The base case for technology in this space right now is AI, machine learning, and deep customer analytics for the end-to-end customer experience, said Christian Kelly, strategy managing director of internet, software and platforms at Accenture.
“Over time what you want to do is use machine learning and AI to understand the experience at an individual gamer level, so that you can reinforce the positives and you can take remediation steps on the negatives,” he said. “That’s a huge thing for all gaming companies to do from a centralized standpoint, and for the company to own.”
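As a rough illustration of that per-player idea, and only as a toy sketch with invented signal names, weights, and thresholds rather than any company’s actual pipeline, behavior signals could be aggregated into a score that routes a player toward reinforcement, monitoring, or remediation:

```python
from dataclasses import dataclass

# Toy sketch of per-player behavior scoring; signal names, weights, and
# thresholds are invented for illustration and do not describe any
# company's real system.
@dataclass
class PlayerSignals:
    commendations: int      # positive reports from other players
    filtered_messages: int  # chat lines caught by the safety filter
    confirmed_reports: int  # abuse reports upheld by moderation

def behavior_score(s: PlayerSignals) -> float:
    return 1.0 * s.commendations - 0.5 * s.filtered_messages - 2.0 * s.confirmed_reports

def next_action(s: PlayerSignals) -> str:
    score = behavior_score(s)
    if score >= 5:
        return "reinforce"   # e.g., surface positive-play rewards
    if score <= -5:
        return "remediate"   # e.g., warning, chat restriction, moderator review
    return "monitor"

print(next_action(PlayerSignals(commendations=2, filtered_messages=1, confirmed_reports=4)))
# -> "remediate"
```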
But there’s also decentralized technology coming out all the time, he added, pointing to Temper, a tool developed by the Global Innovation Exchange at the University of Washington, which can be attached to a TV or monitor. It listens for things like hate speech and bad behavior, and will actually terminate gaming sessions.
“There are things, from a technology standpoint, that the industry can do, but there are also new innovations that are happening based on hardware, software, and cloud services that are going to enable parents to be more educated and do something about it in a decentralized way,” he said.
Higgins, who’s also on the executive steering committee of the Fair Play Alliance, noted that cooperation within the industry is what will drive great strides in addressing these issues as well.
“One thing that’s really joyful about working in the video game industry is the collaboration and the will to work together to solve some of these huge issues,” she said. “There are some wonderful conversations and sharing of best practices and tools, with big platforms making some of these tools, which would have been inaccessible to small studios and startups because they just couldn’t afford them, available for people to use for free. We know there’s a lot more of that coming in the pipeline as well.”
Overall, the industry must recognize that it’s a shared community, and that everyone has a duty of care to keep that community safe and healthy, no matter which platform players end up on.