Google’s Perspective algorithm is a tool for censoring “toxic” speech based on word combinations that isn’t effective enough for censorship proponents. (Who come mostly from media. Oliver Darcy’s efforts on CNN were crucial to the campaign to ban Alex Jones. They should just give him the Pulitzer. Come on, msm, you know you want to.) Cable news, formerly more prestigious outlets such as the Atlantic, and of course the Huffpo-sphere all contribute to the campaign prodding the social media companies toward ever more de-platforming and censorship. Tech media provides creative technical advice.
The near future of censorship will focus on individuals and their ability to associate. Taking out Jones isn’t just about silencing him; it’s about taking out a node of transmission, by which the curious find their way to more serious and, ultimately, more Narrative-damaging content. From the severely progressive site Rantt:
Google’s new Perspective algorithm is a good start, but it’s just one piece of the puzzle. We can’t solve it with the data points from a single comment, even with the most well-trained recurrent neural networks. Ultimately, we need to teach computers to follow a conversation and make an informed opinion of a person’s character, something that can’t be done by a single neural net heavily reliant on parsing language.
It’s not the character of the content but the content of your character
Understanding how to do it may be one of the most important technical issues we tackle, or lose the web to armies of trolls, bots, and people really into goose-stepping to a strongman’s tune.
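For readers who haven’t seen it, the Perspective system quoted above works by scoring a snippet of text for attributes like TOXICITY via a JSON request to Google’s comment-analyzer endpoint. A minimal sketch of what such a request body looks like, per Google’s public documentation (the comment text here is illustrative, and a real call would also need an API key):

```python
import json

# Sketch of a request body for Google's Perspective API
# (POST to commentanalyzer.googleapis.com/v1alpha1/comments:analyze).
# The API returns a score between 0.0 and 1.0 per requested attribute.
def build_analyze_request(comment_text: str) -> dict:
    """Build the JSON body for a comments:analyze request."""
    return {
        "comment": {"text": comment_text},
        "languages": ["en"],
        # Ask the API to score the comment for toxicity.
        "requestedAttributes": {"TOXICITY": {}},
    }

body = build_analyze_request("you are a troll")
print(json.dumps(body, indent=2))
```

Note that the whole unit of analysis is a single comment’s text, which is exactly the limitation the Rantt piece complains about: nothing in the request carries conversation history or anything about the speaker.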
Social media executives, down with the cause but retaining sympathy for the bottom line, are pressured from within as well. Their ranks are rotten with progressives clamoring for more censorship, like cops who resent not being able to bust heads:
Tech companies succeed or fail based on the talent of their developers, which gives those workers the leverage to shape the company culture. So when your engineers tell you there’s a problem, you listen. That was clear again this week when Twitter engineers took to the site to push back against CEO Jack Dorsey’s comments about why notorious conspiracy theorist Alex Jones is still on the platform when other tech companies have banished him.
Dorsey responded to his engineers publicly, thanking them for their thoughts and pledging to do better…
The pressure on Twitter to ban Jones from its platform grew exponentially this week, though, after other major companies like Apple, Facebook, and YouTube started taking action against him for violating their terms of service. On Tuesday, Dorsey tweeted, “We didn’t suspend Alex Jones or Infowars yesterday. We know that’s hard for many but the reason is simple: he hasn’t violated our rules. We’ll enforce if he does. And we’ll continue to promote a healthy conversational environment by ensuring tweets aren’t artificially amplified.”
Dorsey further explained that Twitter couldn’t ban Jones based on “succumbing to outside pressure,” and he called on journalists to continue to fact-check him. This didn’t go over well with journalists—many pointed out that we spend a lot of time fact-checking nonsense, but that it’s not our job to keep a viral disinformation incubator healthy;
it’s our job to report facts. The defense also fell flat with some current and former Twitter employees. “There is no honor in resisting ‘outside pressure’ just to pat ourselves on the back for being ‘impartial,’”
Jack, the call is coming from inside the house…!
Twitter engineer Marina Zhao tweeted. “I agree with @ekp that Twitter does not exist in a vacuum, and it is wrong to ignore the serious real-world harm, and to equate that with political viewpoints.” @ekp is Ellen Pao, formerly of Twitter and Reddit, who had earlier replied to Dorsey, “We tried treating @reddit as a silo, and it was a huge mistake. People got harassed cross-platform. Also if your site is the only one that allows this hate and harassment, it will get overrun and collapse.”
In the end, taking Jones out might be the best thing for the right. The left is defusing a bomb that has already gone off, and if Jones disappears entirely, he takes with him a reputation for crazy that will no longer be applied to the right. And in all likelihood the deplatforming of Jones will work as intended.
“We’ve been running a research project over the last year, and when someone relatively famous gets no-platformed by Facebook or Twitter or YouTube, there’s an initial flashpoint, where some of their audience will move with them,” Joan Donovan, Data and Society’s platform accountability research lead, told me on the phone, “but generally the falloff is pretty significant and they don’t gain the same amplification power they had prior to the moment they were taken off these bigger platforms.”
The sad fact is that someone like Jones has nothing other than his platform, his voice. Emphasis added:
Deplatforming works “best” when the people being deplatformed don’t have any power to begin with. Nor are we talking about people from marginalized communities who have self-censored or left social media because of far right harassment and hate campaigns (and could, in theory, come back with more proactive moderation by large platforms.)
I say the author’s self-conscious; he’d say thorough. But following “we’re crushing the powerless” with “but not the real powerless” is comic gold. Thank you, social justice man. Who, whom all the way down.
Once they’ve purged the net to the extent possible, expect to be hounded right into the dark web weeds:
Nonetheless, the concern among academics is that, as hate moves to the darker corners of the internet, some of their old followers may move with them and become further radicalized. “The good that comes with deplatforming is, their main goal was to redpill or get people within mainstream communities more in line with their beliefs, so we need to get them off those platforms,” Robyn Caplan, a PhD student at Rutgers University and Data and Society affiliate, told me on the phone. “But now we’ve put them down into their holes where they were before, and they could strengthen their beliefs and become more extreme.” The question is whether it’s more harmful to society to have many millions of people exposed to kinda hateful content or to have a much smaller number of ultra-radicalized true believers.
The work of social justice never ends, or, it ends at the barrel of a gun.