
How Can Overwatch Deal With its Toxic Players?

If you’re playing any online game, you’ll run into toxic players. This is an accepted inevitability. But it shouldn’t be.

Overwatch in particular is intentionally welcoming and joyful. It’s about hope and heroes in a colourful world that doesn’t take itself too seriously. Plus, Blizzard has intentionally focused on inclusivity through its characters. And so it is a fun place to hang out – most of the time.


Hate speech. Slurs. Threats. Even the “casual” hatred of players who are perceived to be performing badly can sour a game. From my experience and a survey I conducted of 165 players, it’s happening in approximately a quarter of matches. It’s a systemic problem that Blizzard must fix.

[Content warning for in-depth discussion of threatening and hateful speech, including bigotry of all kinds, suicide threats, and rape threats. Slurs censored by the author are adapted from real-life examples.]

Overwatch is a competitive game. It’s understandably frustrating when you feel your team is letting you down. I somewhat reluctantly main Mercy since healers are such rare picks and, sure, I’ve been annoyed by demanding teammates. But there’s a line between a little bit of personal salt and outward toxicity. And an all-too-large minority of Overwatch players have left that line so far behind that it’s out of sight.

Toxicity most harms the vulnerable

It is important to note that, whilst toxicity can be aimed at anyone, it is all too often built upon oppressive social constructs. A toxic player might lash out at a teammate’s supposed bad aim by claiming they must be an ableist-slur-transphobic-slur or telling them to “kill yourself, racial-slur-homophobic-slur”. In other words, though these attacks are against an individual whose identity may be entirely unknown, the attacker denigrates them by assuming that they must be part of a marginalised group.

Of course, this also affects those whose identity is known (or assumed). Those who are perceived as women, LGBTQ+ people, or people of colour based upon their voice or in-game handle are often attacked before a match even starts.

Even characters are not exempt. Those who main Lúcio told me that they and he were too often mockingly referred to by the n-slur. Choosing Sombra was also reported to lead to anti-Latina racism and racialised misogyny.

Lúcio and Sombra are often abused by toxic players

Make no mistake about this: the majority of toxic players are not just rude. They are bigoted.

This isn’t okay

Overwatch should not be a place filled with rudeness, but it is infinitely worse that it is now an environment where oppressive behaviour is rampant. It is not merely unpleasant but directly harmful.

If you’re thinking “so what?” or that people should just grow a thicker skin, I can’t explain to you why you should care about people. That is, and always will be, the main reason to combat bigotry anywhere. But to think for a moment in purely mechanical and economic terms, toxic players also harm Overwatch as a game in a number of ways.

Toxicity affects how we play

A huge number of people (including myself) also avoid using voice chat. Fewer people feeling comfortable using mics means less cooperative teams. Moreover, it is marginalised people who are more likely to take this step, making Overwatch’s player base feel more monolithic than it truly is. This in turn feeds into toxic players feeling safe in spouting bigoted views, in a vicious cycle.

Toxicity also often leads to people taking long breaks after instances of abuse or simply when they are too fed up to continue. Some have quit playing altogether. This hurts Blizzard’s profits, since people who aren’t playing cannot contribute to the microtransaction economy.

Toxicity makes us play worse

A common excuse for harassment is that someone is playing a hero considered worse according to Overwatch’s rigid meta. Players reported that they therefore avoid certain heroes even if they prefer playing them or are better at them. This leads to boring, samey games – even in Quick Play, which is supposed to be a place to experiment and learn new heroes. Moreover, this rigidity, upheld by harassment and fear, is actually harmful to teams’ chance of winning when switching according to context rather than the sterile meta picks would genuinely counter the opposing team.

Finally, whilst toxic players are usually raging “because” they want to win, most people will play worse when they are uncomfortable. They may also take time away from the match in order to block and report the offenders. Some refuse to cooperate with a toxic teammate, setting their own boundaries and instituting their own punishment. In this way, abusing a fellow player for their perceived lack of skill or non-meta hero pick is entirely counterproductive.

In fact, one person theorised to me that this means that toxic players lose disproportionately compared to their personal skill, since they are unable to work in a team. This in turn makes them ever angrier – another vicious cycle.

Toxic players are not inevitable

There is a certain sentiment that pervades discussions like these: a sense that this kind of problem is unavoidable. I understand the inherent defeatism that underpins attempts to combat toxicity. It can feel like it’s baked into the very fabric of the game.

But it’s not.

Overwatch is essentially a public space. From a lobby of thousands, twelve people come together to cooperate and compete. They are, for the duration of a match, occupying the same space, digital or no.

Think about other public spaces. Think about why you don’t (usually) get people walking into a café and screaming rape threats at a f-bomb lesbophobic-slur as soon as she says hi.

I’m far from the first to suggest this, but it’s time we started treating digital spaces the same as we treat meatspace ones. (I avoid using “real life” here because the internet is a part of real life; separating those spaces facilitates this kind of abuse.) To put it simply, this means finding a way to stamp out the vast majority of the toxicity, once and for all, no matter how difficult that is.

What does Blizzard already do?

No one knows exactly how Blizzard deals with toxicity. It is somewhat understandable that they would want to keep some things intentionally obscure to prevent those selfsame toxic players from abusing the system, but this also means that harassed players feel unsupported. Nonetheless, some things are evident.

Firstly, there are options for limiting your exposure to toxicity. Obviously avoiding voice chat comes naturally to most who are fearful. For text chat, the optional profanity filter catches most slurs, blocking players prevents you from seeing what they write, and chat can be turned off completely.

Secondly, Overwatch does have a code of conduct. It is brief, but regarding player-to-player communication, it states:

 “You may not use language that could be offensive or vulgar to others.

Hate speech and discriminatory language is inappropriate, as is any obscene or disruptive language. Threatening or harassing another player is always unacceptable, regardless of language used. Violating any of these expectations will result in account restrictions. More serious and repeated violations will result in greater restrictions…

We reserve the right to restrict offending accounts as much as necessary to keep Blizzard games a fun experience for all players.”

Thirdly, improving the reporting system has been on Blizzard’s radar since at least July of last year, when Jeff Kaplan announced that there would be “under the hood” tweaks to combat the issue. With the latest patch, they released another set of reporting tools (which had previously been in testing on the Public Test Region (PTR)) in another attempt to stem the flow of hatred.

What should Blizzard do better?

Nonetheless, each of these has problems. Leaving players to protect themselves from toxicity places a burden on the victim, affecting their gameplay experience. Avoiding voice chat prevents the efficient teamwork that is necessary for playing well. The existing profanity filter misses, for example, rape threats or urging other players to commit suicide. Blocking players does not mean you don’t play with them, only that you can’t read what they write. And people must manually disable chat at the beginning of every game; there is no option to toggle it off for longer periods of time.

The code of conduct is a nice sentiment, but is vague in every part. Swearing is using “language that could be offensive or vulgar to others”. So is the word “moist.” Definitions and clear boundaries are vital in a code like this, and Blizzard has none.

Moreover, whilst they state that they will “restrict offending accounts as much as necessary”, they appear not to prevent consistent offenders from playing their game.

This is where the vagueness of the system makes it difficult to know for sure, but whilst toxic players often report being silenced from communication, suspensions or bans from playing entirely seem incredibly rare, if they ever happen. There are many forum posts and Reddit threads raging about bans for cheating or consistently leaving games, but none mentioning toxic behaviour. It is well known that any single instance of cheating can lead to an instant ban, but it does not seem that even provable sustained abuse does the same.

This is an inherent value judgement on Blizzard’s part: that cheating or leaving ruins the game for other players more than toxicity does. And as long as developers do not take toxicity as seriously as possible, it will always be exponentially more difficult to eradicate.

What is Blizzard trying next?

As I said, Blizzard is continuing to make changes to the reporting system. Most people I asked were optimistic or even happy about the system that was just released (though at the time of asking it was on the PTR only). Unfortunately, I fall in with the (large) minority that don’t see how it will help. Certainly, it’s great to see Blizzard continually looking at this issue, and it does provide some definitions that narrow and clarify what abusive chat is. (“Abusive chat is any form of hateful, discriminatory, obscene, or disruptive communications, threatening or harassing another player.”) But, since it lacks a solid foundation, the action of sending a report appears inherently ineffectual.

It seems reports lead to action based upon volume alone. Apparently, if a player receives many reports against them for any reason, say simply for playing Widowmaker, they can repeatedly have their chat temporarily silenced regardless of the realities of the situation.

This means that if few people report a toxic player, they are unlikely to receive even the mildest sanction of a chat timeout. And the complete lack of feedback means that reporting is such an unsatisfactory experience that many do not bother.
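The volume-only logic described above can be sketched to show its core flaw. Everything here – the class, the method names, the threshold – is an illustrative assumption of mine, not Blizzard’s actual code:

```python
from collections import defaultdict

REPORT_THRESHOLD = 10  # reports needed before any action (assumed value)

class ReportTracker:
    """Hypothetical sketch of a purely volume-based reporting system."""

    def __init__(self, threshold=REPORT_THRESHOLD):
        self.threshold = threshold
        self.reports = defaultdict(int)

    def report(self, player_id: str) -> bool:
        """Record one report; return True if the player gets auto-silenced.

        Note the flaw: the *content* of each report is never examined,
        so ten spite reports for picking Widowmaker trigger the same
        sanction as ten reports for genuine hate speech.
        """
        self.reports[player_id] += 1
        return self.reports[player_id] >= self.threshold
```

Because `report()` never asks *why* a player was reported, coordinated spite reports and genuine abuse reports are indistinguishable – which is exactly the weakness at issue here.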

Again, it’s somewhat understandable that Blizzard would not want the intricate workings of its reporting system (assuming that there is such a thing) on the internet lest toxic players learn to subvert it. But as it stands, the most common phrase used to describe it is “screaming into the void.” And besides – reporting only exists on PC versions of Overwatch. Console players are left with even less.

More potential for subversion?

This new system seems even riper for abuse. One category is “poor teamwork”, which – though Blizzard does clarify that this has more to do with “constantly communicating in a negative fashion (i.e. “this team is horrible”)” – seems to conflate perceived poor play with the much larger problem of bigotry.

Almost certainly, toxic players will abuse this category. Will anyone at Blizzard actually notice if someone is reported for picking Hanzo instead of abusing their team? Do they have a team checking chat and voice records for real information? How does my favourite anti-toxicity tactic – refusing to heal anyone who says a slur – fit into this? Maybe it is poor teamwork, but it’s also a diegetic and instant sanction, something that Blizzard doesn’t seem interested in applying.

Besides, if I politely tell someone not to call me an AIDS-ridden sexist-slur, will they pause to read the helpful explanation of what “poor teamwork” or “abusive chat” actually means or will they just report me out of spite? Will Blizzard be able to tell the difference?

What else can Blizzard do?

I’ve mentioned that Blizzard is cagey about revealing the intricate workings of their system to prevent their subversion, but other games are clearer. For example, League of Legends shows a player a record of the offence that led to the sanction and quickly escalates to total play suspensions and even permanent bans. It is both more transparent and more heavy-handed than Overwatch’s system. It looks, in short, better.

Yet League is still – ahem – legendary for its toxic players. Again, there is a sense of inevitability. So what can developers do?

Frankly, no one knows. But there is a wide swathe of options that Blizzard – and other companies – have been slow to experiment with. If many of these multi-billion-dollar corporations took toxic players more seriously, we would at the very least have more data on what works and what doesn’t. As it is, we only know that the current tactics are ineffectual. But any changes to these tactics remain slow to come.

We have to get serious

Let me preface by saying that I hope this is not the only solution that works. I want a lesser measure to stop toxicity, if only because it would restore some faith in humanity. But, ultimately, there is no reason that slurs should be allowable at all in text chat. (Nor in voice chat, but this is trickier to police.)

Certainly, free speech is important, but by enabling the ability to call a teammate a slur, Blizzard is damaging the experience for the five or eleven other players in the match, who equally have the right to play without being harassed. Plus, recall that Blizzard already employs a filter that forcibly changes “gg ez” (good game, easy, a common phrase used to condescend to the opposing team) into a variety of other messages that often instead demean the speaker, such as “I’m wrestling with some insecurity issues in my life but thank you all for playing with me,” and “I feel very, very small… please hold me…”. This was accepted in good faith by the community: preventing a player from inciting hatred ought to be no different.

Filtering or outright banning slurs will harm no one. Not being able to call someone a slur is not harm. If you are worried about a slippery slope, I sympathise, but I would rather focus on the real existing issue than a potential future one. And by, essentially, putting the existing profanity filter (expanded to include what is currently missing, like rape threats) in place for all players, toxicity would ultimately be eradicated. Yes, players could simply enable the existing filter themselves, but making the filter mandatory is what will undeniably enforce the rules that Blizzard themselves spell out in their code of conduct.
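To illustrate how small the technical ask is, here is a minimal sketch of the kind of mandatory server-side text filter proposed above. The blocklist entries and function names are placeholders of my own, not Blizzard’s implementation:

```python
import re

# Placeholder blocklist; a real one would be far larger and maintained
# with input from the communities the abuse targets.
BLOCKLIST = ["slur1", "slur2", "badphrase"]

# Compile one case-insensitive pattern matching any blocked term.
_pattern = re.compile(
    "|".join(re.escape(term) for term in BLOCKLIST), re.IGNORECASE
)

def filter_chat(message: str) -> str:
    """Replace every blocked term with asterisks, for all players."""
    return _pattern.sub(lambda m: "*" * len(m.group()), message)
```

A real system would also need to handle deliberate misspellings and leetspeak evasion, which is one reason human moderation remains necessary alongside any automated filter.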

I hope we can avoid going this far

This is not the only potential solution. Cafés do not employ speech filters (especially since they cannot), yet there are real consequences for yelling hate speech in one. You would likely be banned by the staff immediately and could even be prosecuted. This is something that Blizzard can and should apply more forcefully. Silencing bigots for perhaps a day at a time is clearly not a worthwhile deterrent. As per their code of conduct, they must find what is “necessary to keep Blizzard games a fun experience for all players.”

Again, we do not know what will work in this respect because Blizzard – and other companies – have not tried it. Diegetic sanctions may be the answer. For example, they already use reduced XP as a punishment for those who leave competitive matches before the time is up. Many think that simply making a strong public statement would help, if only to support those who are harassed. I also believe that a set of guidelines the game shows directly to players periodically, written with the help of those most marginalised by Overwatch’s harassment, would go a long way as a show of good faith, even if players might choose to ignore them.

Any solution will need high investment

Potentially, a team who reads chat logs or listens to audio from the time of a report would entirely erase the problems that I, other players, and Blizzard all fear when it comes to abuse of the reporting system. For example, Blizzard removed the ability to avoid people after players abused it to avoid excellent opponents – this system could return if a team monitored it. Not to mention that we would finally be able to have transparency about how any of this works.

I do not pretend to know what this would cost. I would believe it if it was genuinely prohibitively expensive. But even if toxic players weren’t costing Blizzard money (hint: they are), a partial implementation of this – for example, using a randomly selected small percentage of reports, or having an algorithm scan sections of chat from the time of a report for slurs – would be a direct commitment to real inclusivity from a multi-billion-dollar corporation that claims to value it.

In summary…

Toxicity has its roots in bigotry, and spreads it in turn. This is something that must end. Ineffectual systems that do not quell toxic players are so entrenched that we cannot know what the real solution looks like – yet. But two things are clear: a solution is necessary, and we must find it outside of the structures that have thus far failed.

Marginalised folk – and all players – deserve a game without toxicity, and so they deserve developers who will work outside of the established box to create that. And Blizzard is in a great position to lead the way, if they’d only try it.


Jay Castello

Jay is a freelance games writer specialising in intersectional feminist critique, how to improve games and use them to improve the world, and cute dogs. She loves inhabiting digital spaces in all their forms, and being constantly surprised by just how weird and wonderful games can be.


8 responses to “How Can Overwatch Deal With its Toxic Players?”

  1. TwoLiterSoda

    I’ve run into this many times playing Overwatch, mainly because I play characters I enjoy but others perceive as bad, not only that but also because I stink haha. I play for fun, pick characters I like, and generally have a good time, but when teammates throw hate speech at me, curse me out for being a woman, and so on, it’s a bit much. Players have to realize it’s a game, it’s meant to be enjoyed by all; those same players need to grow up and mature emotionally, because obviously something is wrong if they feel it’s ok to act as they do.

  2. The subspecies that occupies this game was the reason I stopped playing it after 3 days. It was easy to spot early on that Overwatch was going to become a hostel for the toxic. Come over and play The Division – best community ever

  3. Brendon Holder

    This is a tough subject and relevant to what is happening around the world. Hate speech is awful, but we have to be careful of “banning” anything. Banning something never works, for one thing, and even if it did, do we really want to live in a world where freedom of speech is monitored? Yes, people abuse this privilege just like people abuse most privileges, but taking away that freedom will have even more serious repercussions. The only way to control this is to give the individual being offended the ability to mute the person offending them. Giving them the freedom not to listen. Allow us to right click on a character’s portrait and choose “ignore”. This should stop any text they type being seen by the offended and mute their speech to the offended too. Problem solved. These disgusting cowardly people only get strength when offended people respond. Don’t respond.

    1. “Freedom of speech” is your right to not be censored by the government. It has no real relevance in a discussion about anything that doesn’t actually include government censorship. If Blizzard decided to censor what you say, they are not impeding your rights. Frankly I find bringing up “freedom of speech” as a slippery slope to tyranny every time policing bigotry is brought up (almost always) to be histrionic. It’s called accountability, and kind of essential in healthy communities. You’re an asshole, you get the door. Something many online communities haven’t cottoned onto yet.

      While I do agree to some extent that engaging with these hateful douchecanoes feeds the flames, the fact is that ignoring it doesn’t actually _solve_ the problem of them being there and behaving that way in the first place. Yeah I guess we could have an ignore function, but I’d still have to see that trash in the first place? And then you have to consider- these people aren’t a loner in the room that you can effectively shun through silence. Because TONS of people play Overwatch, and a lot of the times they meet likeminded individuals and they get to engage in their shitty bigotry together. They don’t stop doing what they’re doing. What you’re suggesting is very much “out of sight, out of mind” and that’s NOT a good way to deal with toxicity.

  4. asadachi

    I don’t agree with people using derogatory hate speech to lash out at others, but I have to say it: some people are more sensitive than others. If you like playing Overwatch, can’t you turn off voice chat / only play with friends? Is it that they’re sending you text chat, for the PC players? Well, if that is the case, Blizzard should allow you to report such issues with a quick screenshot. Not saying the hate is right, just saying it’s only words and should be able to be avoided, I would think.

  5. Gizensha

    The solution is simple, and sadly something that no company will ever be willing to implement because doing so would be expensive.

    Stop relying on automated systems. They don’t work for this. We all recall games of with text based chat where people couldn’t discuss mountains, but that didn’t stop anyone from using the racial slur that filter was an attempt to block if they wanted to – 1337 was a thing, and while people by the early 00s stopped using it by habit for the most part and only tended to use it ironically and for emphasis, it was still an option to evade automated filters (Which was particularly obnoxious in that game since there were genuine reasons to discuss mountains in it since it was a worms-like game where terrain, such as mountains, mattered, this wasn’t just people doing the equivalent of joking about not being able to talk about Scunthorpe on a forum with a language filter – I’m genuinely curious if that sentence is going to get through Discus unscathed, not having a clue what, if any, language filters Discus has) – Or, at least, those of us who are old fogies who were playing multiplayer online games in the early 00s with friends we met on IRC recall such systems.

    Ultimately, public games on Blizzard’s servers are privately owned public space, and private games is the equivalent of renting out a function room – you’re being given someone else’s space to use privately for a period of time. How does a cafe or pub deal with people spouting hate speech, or otherwise being disruptive, on their property – the closest analogy, as you use in the article? The manager either kicks them out because they’re disturbing other customers, or lets them be at which point the other customers are likely never to come back. The manager knows they’re being disruptive not by other customers reporting them, but by their own staff – or themselves – observing the behavior. In publicly owned public space – the equivalent of a pavement or a public park – with no such staff, problems tend to be more rampant, because there’s no real way of dealing with the situation, and no-one is constantly observing everything going on. There are theoretical ways of dealing with particularly nasty disruptive elements – reporting harassment and other minor crimes such as graffiti to the police – but it’s often more trouble than it’s worth and unlikely to actually amount to anything being done.

    What automated systems, and ones tied to player reporting is doing is trying to deal with disruptive behaviour in public space on private property using tools to deal with it in public space on public property, and that simply does not work. And as such what needs to happen is human moderators, not necessarily working in real time, but some on hand monitoring the various chat systems on a random sample of games and who can look into reports as they come up. Which no company is ever going to do because that would require actually hiring humans to do something automated systems are, at best, highly gameable on and at worst actively prevent the wrong behaviour (The Scunthorpe problem vs 1337 for actually wanting to use slurs), rather than just automating everything and pretending the right combination of algorithms will actually work rather than hiding wedding vows from children while doing nothing to prevent them from seeing hate speech.

    ETA: Ironically, it’s the old hotbeds for microaggressions – MMOs – that are in the best place to actually do something about it – It’s easier to hire enough wizards so that there are a few on each server (possibly the same ones) at all times, operating in shifts, than games hosting a lot of private matches in matchmade situations to hire enough to be an effective ‘bar staff’ monitoring the various rooms for toxic behavior.

  6. Jessica Reese

    But it’s not infringing someone’s freedom of speech. Blizzard is not the government, so if their actions get them blocked, suspended, or banned by Blizzard, it’s not infringing on their freedom of speech. Blizzard is a company, and using their service comes with rules, just like Twitter or Facebook. People get banned from them all the time and you don’t see the companies being railed against for infringing freedom of speech, except by people who do not know what that actually is, apparently.

    And apparently we value the “feelings” of the people who feel way too much and take things a lot harder about a game, to the point where they behave like the mentioned examples? Why are the feelings of people abusing others more important?

  7. Gizensha

    Not finished listening to it yet, but this episode of Ludology might be somewhat relevant to this article –

    Interview with former technical lead of Player Behaviour Team for League of Legends (i.e. the team trying to reduce toxicity in that community. How well it works… Well, LoL.)
