Why Is Big Tech Policing Free Speech? Because the Government Isn’t

In the months leading up to the November election, the social media platform Parler attracted millions of new users by promising something competitors, increasingly, did not: unfettered free speech. “If you can say it on the streets of New York,” promised the company’s chief executive, John Matze, in a June CNBC interview, “you can say it on Parler.”

The giants of social media — Facebook, Twitter, YouTube, Instagram — had more stringent rules. And while they still amplified huge amounts of far-right content, they had started using warning labels and deletions to clamp down on misinformation about Covid-19 and false claims of electoral fraud, including in posts by President Trump. Conservative figures, including Senator Ted Cruz, Eric Trump and Sean Hannity, grew increasingly critical of the sites and beckoned followers to join them on Parler, whose investors include the right-wing activist and heiress Rebekah Mercer. The format was like Twitter’s, but with only two clear rules: no criminal activity and no spam or bots. On Parler, you could say what you wanted without being, as conservatives complained, “silenced.”

After the election, as Trump sought to overturn his defeat with a barrage of false claims, Matze made a classic First Amendment argument for letting the disinformation stand: More speech is better. Let the marketplace of ideas run without interference. “If you don’t censor, if you don’t — you just let him do what he wants, then the public can judge for themselves,” Matze said of Trump’s Twitter account on the New York Times podcast “Sway.” “Just sit there and say: ‘Hey, that’s what he said. What do you guys think?’”

Matze was speaking to the host of “Sway,” Kara Swisher, on Jan. 7 — the day after Trump told supporters to march on the U.S. Capitol and fight congressional certification of the Electoral College vote. In the chaos that followed Trump’s speech, the American marketplace of ideas clearly failed. Protecting democracy, for Trump loyalists, had become a cry to subvert and even destroy it. And while Americans’ freedoms of speech and the press were vital to exposing this assault, they were also among its causes. Right-wing media helped seed destabilizing lies; elected officials helped them grow; and the democratizing power of social media spread them, steadily, from one node to the next.

Social media sites effectively function as the public square where people debate the issues of the day. But the platforms are actually more like privately owned malls: They make and enforce rules to keep their spaces tolerable, and unlike the government, they’re not obligated to provide all the freedom of speech offered by the First Amendment. Like the bouncers at a bar, they are free to boot anyone or anything they consider disruptive. In the days after Jan. 6, they swiftly cracked down on whole channels and accounts associated with the violence. Reddit removed the r/DonaldTrump subreddit. YouTube tightened its policy on posting videos that called the outcome of the election into doubt. TikTok took down posts with hashtags like #stormthecapitol. Facebook indefinitely suspended Trump’s account, and Twitter — which, like Facebook, had spent years making some exceptions to its rules for the president — took his account away permanently.

Parler, true to its stated principles, did none of this. But it had a weak point: It was dependent on other private companies to operate. In the days after the Capitol assault, Apple and Google removed Parler from their app stores. Then Amazon Web Services stopped hosting Parler, effectively cutting off its plumbing. Parler sued, but it had agreed, in its contract, not to host content that “may be harmful to others”; having promised the streets of New York, it was actually bound by the rules of a kindergarten playground. In a court filing, Amazon provided samples of about 100 posts it had notified Parler were in violation of its contract in the weeks before the Capitol assault. “Fry ’em up,” one said, with a list of targets that included Nancy Pelosi and Chuck Schumer. “We are coming for you and you will know it.” On Jan. 21, a judge denied Parler’s demand to reinstate Amazon’s services.

It’s unlikely the volume of incendiary content on Parler could rival that of Twitter or Facebook, where groups had openly planned for Jan. 6. But Parler is the one that went dark. A platform built to challenge the oligopoly of its giant rivals was deplatformed by other giants, in a demonstration of how easily they, too, could block speech at will.

Over all, the deplatforming after Jan. 6 had the feeling of an emergency response to a wave of lies nearly drowning our democracy. For years, many tech companies had invoked the American ethos of free speech while letting disinformation and incitement spread abroad, even when it led to terrible violence. Now they leapt into action as if, with America in trouble, American ideals no longer applied. Parler eventually turned to overseas web-hosting services to get back online.

“We couldn’t beat you in the war of ideas and discourse, so we’re pulling your mic” — that’s how Archon Fung, a professor at Harvard’s Kennedy School of Government, put it, in expressing ambivalence about the moves. It seemed curiously easier to take on Trump and his allies in the wake of Democrats’ victories in the Senate runoffs in Georgia, which gave them control of both chambers of Congress along with the White House. (Press officers for Twitter and Facebook said no election outcome influenced the companies’ decisions.) And in setting an example that might be applied to the speech of other groups — foreign dissidents, sex-worker activists, Black Lives Matter organizers — the deplatforming takes on an ominous cast.

Fadi Quran, a campaign director for the global human rights group Avaaz, told me he, too, found the precedent worrying. “Although the steps may have been necessary to protect American lives against violence,” he said, “they are a reminder of the power big tech has over our information infrastructure. This infrastructure should be governed by deliberative democratic processes.”

But what would those democratic processes be? Americans have a deep and abiding suspicion of letting the state regulate speech. At the moment, tech companies are filling the vacuum created by that fear. But do we really want to trust a handful of chief executives with policing spaces that have become essential parts of democratic discourse? We are uncomfortable with government doing it; we are uncomfortable with Silicon Valley doing it. But we are also uncomfortable with nobody doing it at all. This is a hard place to be — or, perhaps, two rocks and a hard place.

When Twitter banned Trump, he found a seemingly unlikely defender: Chancellor Angela Merkel of Germany, who criticized the decision as a “problematic” breach of the right to free speech. This wasn’t necessarily because Merkel considered the content of Trump’s speech defensible. The deplatforming troubled her because it came from a private company; instead, she said through a spokesman, the United States should have a law restricting online incitement, like the one Germany passed in 2017 to prevent the dissemination of hate speech and fake news stories.

Among democracies, the United States stands out for its faith that free speech is the right from which all other freedoms flow. European countries are more apt to fight destabilizing lies by balancing free speech with other rights. It’s an approach informed by the history of fascism and the memory of how propaganda, lies and the scapegoating of minorities can sweep authoritarian leaders to power. Many nations shield themselves from such anti-pluralistic ideas. In Canada, it’s a criminal offense to publicly incite hatred “against any identifiable group.” South Africa prosecutes people for uttering certain racial slurs. A number of countries in Europe treat Nazism as a unique evil, making it a crime to deny the Holocaust.

In the United States, laws like these surely wouldn’t survive Supreme Court review, given the current understanding of the First Amendment — an understanding that comes out of our country’s history and our own brushes with suppressing dissent. The First Amendment did not prevent the administration of John Adams from prosecuting more than a dozen newspaper editors for seditious libel or the Socialist and labor leader Eugene V. Debs from being convicted of sedition over a speech, before a peaceful crowd, opposing involvement in World War I. In 1951, the Supreme Court upheld the convictions of Communist Party leaders for “conspiring” to advocate the overthrow of the government, though the evidence showed only that they had met to discuss their ideological beliefs.

It wasn’t until the 1960s that the Supreme Court enduringly embraced the vision of the First Amendment expressed, decades earlier, in a dissent by Justice Oliver Wendell Holmes Jr.: “The ultimate good desired is better reached by free trade in ideas.” In Brandenburg v. Ohio, that meant protecting the speech of a Ku Klux Klan leader at a 1964 rally, setting a high bar for punishing inflammatory words. Brandenburg “wildly overprotects free speech from any logical standpoint,” the University of Chicago law professor Geoffrey R. Stone points out. “But the court learned from experience to guard against a worse evil: the government using its power to silence its enemies.”

This era’s concept of free speech still differed from today’s in one crucial way: The court was willing to press private entities to ensure they allowed different voices to be heard. As another University of Chicago law professor, Genevieve Lakier, wrote in a law-review article last year, a hallmark of the 1960s was the court’s “sensitivity to the threat that economic, social and political inequality posed” to public debate. As a result, the court sometimes required private property owners, like TV broadcasters, to grant access to speakers they wanted to keep out.

But the court shifted again, Lakier says, toward interpreting the First Amendment “as a grant of almost total freedom” for private owners to decide who could speak through their outlets. In 1974, it struck down a Florida law requiring newspapers that criticized the character of political candidates to offer them space to reply. Chief Justice Warren Burger, in his opinion for the majority, recognized that barriers to entry in the newspaper market meant this placed the power to shape public opinion “in few hands.” But in his view, there was little the government could do about it.
