Wednesday, September 13, 2023

The dystopian rise of videogame censorship


Online censorship is now coming for gamers. Videogame publisher Activision has announced that it will introduce an invasive new system of online censorship to its hugely popular Call of Duty franchise.

In a blog post last month, Activision set out the details of its new partnership with artificial-intelligence company Modulate. Their aim is to deliver ‘real-time voice chat moderation, at-scale’ in online gaming matches. Apparently, this is needed to stamp out the allegedly rampant menace of ‘toxic’ behaviour among gamers.

Of course, online gaming is not all sweetness and light. Call of Duty titles allow gamers to shoot and kill each other in online battlefields. Much of its success can be attributed to its online matchmaking system, which also allows players to easily communicate with each other, both within and between matches. And yes, part of the fun of this is that gamers often insult and trash talk each other. While it is true that some players can be vile and bigoted at times, this kind of behaviour is normally met with insults in kind. The worst verbal bruisings tend to be reserved for those eejits who deserve them.

In an attempt to get rid of the most abusive players, Activision is taking some incredibly draconian steps that will limit the free speech of all Call of Duty fans. Modulate’s proactive voice-moderation tool – ToxMod – will scan and supposedly identify an increasingly broad, and endlessly vague, spectrum of ‘toxic speech’. So-called hate speech, discriminatory language and harassment are all listed as forbidden, for instance. It will report to Activision not only the words players use, but also the tone and intent behind them.

The tool is set to go online next month. For now, the AI itself will not have the power to ban players from Call of Duty games. Those decisions will still be taken by human moderators. But a system that makes censorship more ‘efficient’, by flagging up supposedly toxic speech in real time, is still a dystopian prospect.

Attempts to curb ‘toxicity’ in gaming are nothing new. Players have long been able to manually report abuse from other players to moderators. But there has been a noticeable escalation of censorship over the past decade or so. Earlier this year, games publisher Ubisoft, best known for the Far Cry franchise, announced a partnership with Northumbria Police in the UK that would unite specialist police officers with moderators to tackle ‘toxic’ gaming culture. And in 2021, Intel was widely ridiculed when it introduced Bleep, software that lets gamers filter out so-called hate speech according to their own propensity to be offended. Bleep features a sliding scale that lets players choose how much of each category of ‘toxic’ speech they are willing to hear – ‘none’, ‘some’, ‘most’ or ‘all’.

Activision’s new AI censorship tool is on an altogether larger scale than these initiatives. Millions of Call of Duty players will soon have their speech monitored when the system goes online. This mass surveillance – and mass censorship – is bound to rob gaming of the spontaneity and escapism that makes it worthwhile.

Gamers need to stand up to this attack on their freedoms.

https://www.spiked-online.com/2023/09/13/the-dystopian-rise-of-videogame-censorship/

***********************************

My other blogs. Main ones below:

http://edwatch.blogspot.com (EDUCATION WATCH)

http://antigreen.blogspot.com (GREENIE WATCH)

http://pcwatch.blogspot.com (POLITICAL CORRECTNESS WATCH)

http://australian-politics.blogspot.com/ (AUSTRALIAN POLITICS)

http://dissectleft.blogspot.com (DISSECTING LEFTISM)

https://immigwatch.blogspot.com/ (IMMIGRATION WATCH)

https://awesternheart.blogspot.com/ (THE PSYCHOLOGIST)

http://jonjayray.com/blogall.html More blogs

*******************************


1 comment:

Stan B said...

Oh...if only there were some user-controllable technology that made it impossible for the abusive gamer to be heard by those who feel that player is being abusive. If only.....then these system-wide draconian measures would not be necessary.

Maybe...just maybe....a simple mouse click to MUTE the person being an asshat? Maybe, just maybe, a way to flag the player as NOT someone you want to play against ever again because they are an asshat?

What's that? Puts too much responsibility on the individual player? Too much for him to perform a mouseclick or two?

Yes...Big Brother is such a better option.....