Faced with yet another wave of review bombing, Valve has taken exactly the kind of firm, decisive action we've come to expect of the company: it's added some graphs to the interface. Nothing's going to top the sheer shade of Gamasutra's headline on this topic - "Steam considers preventing review bombs, adds graphs instead" - which perfectly sums up the exasperation so many developers feel with the service at this point. "We're aware of the problem, and doing as little as humanly possible to fix it" is in many ways the worst response the company could have conjured.
Review bombs, for the uninitiated, are when a mob descends on Steam to leave a flood of poor reviews - often accompanied by grossly offensive comments - on a game whose developer has done something to raise their hackles. The most recent example was the review-bombing of Firewatch after its developers demanded PewDiePie remove videos of the game following his latest racist "slip of the tongue", but dozens of developers have been targeted for this kind of treatment, often simply for the sin of being female or a minority, or for otherwise upsetting the denizens of Reddit and 4chan. Review bombing can hammer a game's rating on Steam and consequently do serious harm to its sales - it's a direct and effective way for online mobs to damage the livelihoods of developers.
"What's most frustrating is that the data Valve is now revealing makes it clear how trivial it would be to fix this problem if they actually cared"
Valve's response has been to effectively shrug off responsibility for these problems and place them entirely on the shoulders of its customers. Steam users will now be expected to look at a graph for a game to make sure it's not being review bombed, a process that assumes they've bothered looking at a game with a low review score in the first place, that they understand what review bombing is, and that they know what it looks like on a graph - all of which seem vanishingly unlikely, making this move entirely unhelpful. What's most frustrating is that the data Valve is now revealing makes it clear how trivial it would be to fix this problem if they actually cared; an automated system that flagged possible review bombing for human review could be built with minimal effort.
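To illustrate just how low that bar is, consider a minimal sketch - purely hypothetical, since Valve's internal data and tooling aren't public - of the kind of spike check such a flagging system could start from, assuming nothing more than a per-day count of negative reviews for each game. The function name, thresholds and numbers here are all illustrative.

```python
# Hypothetical sketch: flag a game for human moderation when its daily count
# of negative reviews spikes far above its own recent baseline.
from statistics import mean, stdev

def flag_possible_review_bomb(daily_negative_counts, window=30,
                              spike_factor=5.0, min_reviews=20):
    """Return True if the latest day's negative reviews look like a bombing spike."""
    if len(daily_negative_counts) <= window:
        return False  # not enough history to establish a baseline
    baseline = daily_negative_counts[-window - 1:-1]  # the preceding `window` days
    today = daily_negative_counts[-1]
    avg = mean(baseline)
    spread = stdev(baseline)
    # Flag only when today's volume is both absolutely significant and far
    # outside the normal day-to-day variation for this particular title.
    return today >= min_reviews and today > avg + spike_factor * max(spread, 1.0)

# Example: a quiet title that normally gets a handful of negative reviews a day
# suddenly receives hundreds.
history = [2, 1, 3, 0, 2, 1, 2, 3, 1, 2] * 3 + [250]
if flag_possible_review_bomb(history):
    print("Queue this game for human review")
```

A real system would obviously weigh far richer signals - account age, whether the reviewer actually owns the game, the text of the reviews themselves - but even a filter this crude would surface suspicious spikes to a human moderator within hours, which is precisely the layer Valve has chosen not to build.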
This is the crux: the problem is not technological, it's philosophical. Valve clearly has a strong philosophy and political standpoint about how online services should work, and about how the freedom of users (including users who haven't actually bought a game but still want to give it a 1-star review and call its creator a variety of colourful names) should be balanced against the protection of creators' livelihoods. Moreover, it believes that where problems do arise, they can be fixed with more data and tweaked algorithms; this reflects the broad Silicon Valley ideological fervour about the power and purity of algorithms, which conveniently ignores the extraordinary degree to which code that deals with human behaviour tends to reflect the implicit biases of its authors. Valve absolutely knows it could fix this issue with tighter rules, more human supervision and, yes, some better algorithms to report back to those supervisors; it just doesn't want to do that, because that's not its philosophy.
On the flip side of this coin, you've got Blizzard - a company which embraces technology with every bit as much fervour as Valve, but takes a very different approach to protecting its players and policing its community. Last week, Overwatch game director Jeff Kaplan laid things out very clearly - the team has absolutely refocused personnel and resources away from other features and content in order to address player toxicity in the Overwatch community.
"There will always be a subset of people who get their kicks from aggressively ruining other people's fun... For them, harassment and abuse is the game, more so than the game itself"
It's throwing people, time and money into turning its blockbuster game into an environment where people can play without facing abuse and harassment. Some of that effort boils down to developing smarter systems - Kaplan bemoaned the fact that resources have been diverted from other features in order to implement player-reporting on consoles, a feature Overwatch initially lacked on those platforms in the belief that the consoles' own native player management systems would suffice. Other, simpler aspects of the game's design are clearly intended to head off player harassment - dropping kill:death ratios from the scoreboard, for example.
Ultimately, though, the work Blizzard is doing on Overwatch - and that other companies that genuinely care about this issue, such as League of Legends creator Riot, are doing on their games - is about people, not algorithms. Clever design and good code only get you so far in managing human behaviour; there will always be a subset of people who get their kicks from aggressively ruining other people's fun, and who will devote themselves to learning to game any system you throw in their way. For them, harassment and abuse is the game, more so than the game itself; to deal with them, ultimately, you need living, breathing people empowered to make good decisions on behalf of the community.
This is, perhaps, what Valve doesn't understand - or what it understands but rejects on the grounds of its philosophy. Review bombing is a form of "gaming" Steam's systems; it arose because groups of people discovered that it was possible to manipulate Steam's data and algorithms in a way that achieved their political objectives. You can't fix that just by providing more data; the only way to prevent it from impacting the income and livelihoods of developers (who, let's remember, give Steam a hell of a cut of their sales, and deserve to be treated with a damned sight more respect by Valve) is to seal up the loopholes and add a layer of human supervision, in the form of people with a clear mandate to protect creators and preserve Steam as a great retail platform, not a vector for online abuse mobs.
Blizzard's philosophy here (and again, it's not just Blizzard - plenty of other game operators also recognise the value of properly policing and improving their communities) is a great counter-example, and a window into what a service like Steam could be if Valve rooted out some of this counter-productive philosophy from its way of doing business.
Steam is one of the most important developments in the games business in the last 20 years; it effectively rescued PC gaming from commercial doldrums and placed it on a path to the rude health it's in now. Its very importance, both commercially and culturally, cries out for better management; for it to slide into being a hostile environment for developers and a vector for online abuse (and thus also a pretty unpleasant place for the average consumer to shop for games) is a deeply worrying and unfortunate outcome.
Valve's critics don't pile on the company out of dislike, but out of disappointment at seeing such a vital part of the industry go to seed; if the company is really serious about fixing these issues, we'd suggest that a good first step would be to put down the graphs and see if maybe Blizzard is free for a chat on the phone.