Alex Jones is famous for spreading conspiracy theories, and to stop people from believing in his crazy ideas, Facebook, YouTube, Instagram and iTunes conspired to ban him on the same day. These companies have clearly not heard of the “Streisand effect” because millions of new viewers flocked to his website, and Jones reported a record number of new subscribers.
Many people met the totalitarian crackdown on free speech with great concern. “First they came for Alex Jones, then they came for you” was a popular meme in its aftermath. Some dismissed these fears as hysteria, but then YouTube censors struck again, this time against the family-friendly liberal channel H3 Productions, which has more than six million subscribers.
In the middle of a live stream in which the hosts discussed the banning of Alex Jones and the issue of free speech, YouTube revoked the channel’s livestreaming rights. Note that H3 did not endorse or promote any of Jones’ conspiracy theories, or the content that has been deemed “hateful.” All the hosts did was discuss censorship in the context of what happened to Alex Jones.
After Ethan Klein of H3 Productions tweeted about the incident, YouTube quickly replied that they had made a mistake. The company removed the community strike and reinstated H3’s livestreaming rights.
Hi Ethan – Our team looked into this and determined the live stream was taken down incorrectly. The strike has been removed. Our apologies and thank you for your patience!
— Team YouTube (@TeamYouTube) August 11, 2018
It could certainly be true that YouTube made an error, but it is still worrisome that service to a channel with millions of subscribers can be cut off so easily. Imagine if an electrical utility did the same thing: the power is cut in the middle of a critical operation at a hospital, or a factory is shut down, and the utility just says “oops.”
That would have been deemed utterly unacceptable and become a major scandal, but with YouTube – arguably the most important infrastructure for free speech in the world – it’s just an ordinary Tuesday. It happens.
This brings us to another debate that has been raging: Should the major social media companies such as Facebook and YouTube be regulated as public utilities? The idea is that your local electrical utility should not be allowed to shut off your electricity because you voted for Donald Trump, because that would constitute such a reduction of your capacity to operate freely in society that it would amount to a form of structural violence.
Another argument being made is that newspapers, journals and magazines have editorial responsibility. That is, if someone publishing in a journal makes threats of violence or libelous claims, the publication can be sued or criminally charged for allowing such illegal or defamatory statements to be published.
When Facebook, Twitter and YouTube appeared on the scene, they were adamant that they were not publications, because that would mean being legally responsible for everything published on their platforms, which would be a death sentence for their business model.
They therefore argued that they were only the infrastructure for communication. Just like a phone company cannot be held responsible for defamation communicated via their phone lines, so Facebook, Twitter and YouTube should be exempt from editorial responsibility.
That’s a fair argument if they do not editorialize the content. However, editorializing is essentially what the colluders did when they decided to ban InfoWars from their platforms. The solution to the problem may therefore be to insist that if they want to interfere with the content published on their platforms, they must also take full editorial responsibility.
A TECHNOLOGICAL SOLUTION
On his Periscope channel, Dilbert-creator Scott Adams asked why Twitter and others don’t have a mode for turning content filters on and off. If you don’t want to see offensive content, turn on a filter that blocks it.
He speculated that there might be a technical obstacle preventing this, since such a filter would otherwise be a simple solution to the whole problem of people being exposed to material they find offensive.
Adams is wrong: there are no technical issues blocking this idea. Sites such as Reddit and Slashdot have had advanced filtering and recommendation systems for many years, and they work.
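To see how simple the mechanism is, here is a minimal sketch of a user-controlled filter in the spirit of Slashdot-style tag moderation. The `Post` structure, tag names, and `UserFilter` class are illustrative assumptions, not any site’s real API; the point is only that an on/off filter of the kind Adams describes is a few lines of logic.

```python
# Minimal sketch of a user-side content filter with an on/off switch.
# Tags, class names, and the feed structure are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Post:
    title: str
    tags: set = field(default_factory=set)  # e.g. {"politics", "offensive"}


@dataclass
class UserFilter:
    blocked_tags: set = field(default_factory=set)
    enabled: bool = True  # the toggle Adams proposes

    def visible(self, post: Post) -> bool:
        if not self.enabled:
            return True  # filter off: show everything
        # Hide the post only if it carries a tag the user blocked.
        return not (post.tags & self.blocked_tags)


posts = [
    Post("Cute cat compilation", {"animals"}),
    Post("Inflammatory political rant", {"politics", "offensive"}),
]

f = UserFilter(blocked_tags={"offensive"})
feed = [p.title for p in posts if f.visible(p)]
# With the filter on, only the first post appears in `feed`;
# setting f.enabled = False restores the full, unfiltered feed.
```

The key design point is that the filtering decision lives with the reader, not the platform: each user blocks tags for their own feed, and everyone else’s feed is untouched.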
The problem that YouTube and the other progressive sites are trying to solve is a completely different one. Progressives don’t just want to be able to block content they find offensive from their own feed. They want to block it from your feed. They don’t want anyone to be exposed to ideas of which they disapprove.
That’s why social justice warriors turn up at conservative conferences: Not to protest but to make noise and “punch a Nazi.” They want to shut down free speech.
In the long run, the technological solution that Adams outlines is the way to go, but first the far-left social media platforms need to embrace free speech. The solution may be to give these companies the right incentive by threatening them with the burden of full editorial responsibility.