Toxicity in Gaming Is Dangerous. Here’s How to Stand Up to It

What happens in video games does not stay in video games. Sometimes, this is a good thing: Decades of research suggest that video games can benefit players in many ways, including increased psychological well-being, enhanced problem-solving and spatial-rotation skills, and even heightened interest in STEM fields. But too often, these benefits are counteracted by rampant toxic behaviors.

WIRED OPINION

Rabindra Ratan is an associate professor in Media & Information at Michigan State University, Cuihua Shen is an associate professor in Communication at UC Davis, and Dmitri Williams is an associate professor of Communication at the University of Southern California. The three have coauthored multiple publications in the field of game studies.

In the world of online gaming, this includes sexual harassment, hate speech, threats of violence, doxing (publicizing others' private information), spamming, flaming (strong emotional statements meant to elicit negative reactions), griefing (using the game in unintended ways to harass others), and intentionally inhibiting the performance of one's own team. Perpetrators of such behavior tend to be younger, male, and high in emotional reactivity and impulsivity. These disinhibited behaviors are fueled by the anonymity some virtual environments afford and by the perception that such conduct is widespread and acceptable, which might help explain why toxicity is somewhat contagious—exposure in previous games has been shown to increase the likelihood that a player will commit toxic acts in future games.

Players often rationalize such toxicity as a normal part of gaming. But, as new research shows, this behavior has significant, long-term negative effects on players, especially those who do not fall into the stereotypical gamer demographic of young white males. Despite studies suggesting that women are equally or more skilled than men in video games when given the same amount of playing time, they are more likely than their male counterparts to be targets of toxicity. And studies suggest toxicity is more harmful to women, not only with respect to psychological well-being but also because certain coping mechanisms—like avoiding voice chat to hide their gender—put women at a disadvantage within the game itself. Such experiences discourage women and girls from playing, which means they are less likely to gain the cognitive benefits of gaming, such as spatial-rotation skills, that are associated with success in technological career paths—an area in which there is already rampant gender disparity. Studies also suggest that exposure to gender stereotypes within games potentially causes negative attitudes about women in other stereotyped domains, such as STEM fields.

Although research tends to focus on gendered toxicity, minority groups are also frequently victimized. According to a recent survey from the Anti-Defamation League (ADL) of 1,000 US gamers ages 18 to 45, more than half of multiplayer gamers reported harassment related to their race/ethnicity, religion, ability, gender, or sexual orientation in the previous six months. This study also found that roughly a third of LGBTQ, Black, and Hispanic/Latinx players experienced in-game harassment related to their sexual orientation, race, or ethnicity. And 81 percent of multiplayer gamers overall experienced some form of harassment, the majority of whom also reported experiencing physical threats, stalking, and sexual harassment. Further, 64 percent of respondents felt that toxicity impacted them, with 11 percent reporting depressive or suicidal thoughts, and nearly a quarter saying they had quit playing certain games as a result of these negative experiences.

The games industry is aware of this issue, and some major companies such as Electronic Arts, Infinity Ward, and Valve have launched anti-toxicity initiatives in response. These programs seem to align with the ADL study’s suggestions: They’ve developed moderation tools for voice chat and improved the ease and transparency of player-reporting systems. Still, the ADL suggests that systemic change will be possible only if other stakeholders, such as civil society organizations, also dedicate resources to this issue. Take This, a mental health nonprofit organization that focuses on supporting the games industry and community, and the Fair Play Alliance, a games industry coalition that shares best practices in supporting healthy player interactions, are doing just that. Governments, too, should get involved, according to the ADL. Legislation targeted at curbing misinformation and divisive speech within traditional social media platforms, for instance, could be expanded to address online gaming toxicity.

But aside from all of these organizational approaches, the most effective way to curb toxicity in online gaming starts from the bottom up, through individual players who actively confront such behavior. Of course, this is easier said than done. In one study, more than three quarters of college-age gamers reported that racist, sexist, or homophobic comments in online games should be confronted, but fewer than a fifth of those individuals reported that they actually do so. Another study, conducted in collaboration with WIRED, found that people who support the Black Lives Matter movement are also likely to stand up to bullying and harassment online, but only a small portion of participants said they do so to a strong extent. Similarly, in the ADL study, fewer than half of respondents said they reported toxicity using in-game tools. The reasons for not reporting included the effort required in the reporting process, the sense that reports are not effective or taken seriously, and the view that toxicity is a normalized part of the play experience.