The market, not lawyers, will ultimately decide society’s preferred outcome for Internet Censorship

It is common to hear social media figures and pundits decry censorship on social media and other technology platforms. As I wrote in Towards Data Science, this threat is very real, especially during the COVID-19 public health crisis. However, exaggerating the threat plays into our society’s growing fondness for “grievance culture.” The story goes: 1) a high-profile person posts content online; 2) a platform-verified user reports the content as having “no factual basis” or as “fake news”; 3) the platform reviews the complaint and removes the content for violating its overbroad Terms of Service or Community Guidelines; and 4) that person’s community complains about censorship of ideas. Rinse, wash, repeat.

This line of thinking has become all too common. We can continue down this path, where everyone derides these platforms without offering any real solutions, but that will do society no good. Instead, we can examine the legal arguments in defense of freedom of speech. Whether those arguments can succeed bears directly on the case for establishing new technology platforms with less arbitrary Terms of Service and Community Guidelines.

Is there a constitutional case for providing recourse, on freedom-of-speech grounds, against these platforms? The answer appears to be no. Here is why:

  1. In Hudgens v. National Labor Relations Board, a majority of the Supreme Court held “While statutory or common law may in some situations extend protection or provide redress against a private corporation or person who seeks to abridge the free expression of others, no such protection or redress is provided by the Constitution itself.” In essence, the majority points to Congress to provide redress, because the Constitution is silent on this controversy between individuals and private corporations.
  2. The Supreme Court’s 2019 ruling in Manhattan Community Access Corp. v. Halleck provides an interesting analysis of the “state-action doctrine.” According to the Supreme Court, the doctrine holds that a private entity may be a state actor if it performs a function “traditionally exclusively reserved to the State.” Specifically, the Court notes “to qualify as a traditional, exclusive public function within the meaning of our state-action precedents, the government must have traditionally and exclusively performed the function.”
    Our current societal dilemma raises a similar question. If Twitter were, hypothetically, to remove the President’s tweets or even his account, he would likely be unable to sue Twitter successfully for removing his content. After all, a court would likely find that a social media platform is not a state actor under the “state-action doctrine,” which raises further questions. A federal appeals court has held that the President’s tweets constitute official statements, so does that imply the platform cannot remove his tweets or his account? Only time will tell.
  3. Prager University v. Google addresses a very similar question. The Ninth Circuit Court of Appeals held that “despite YouTube’s ubiquity and its role as a public-facing platform, it remains a private forum, not a public forum subject to judicial scrutiny under the First Amendment.” The court cites Manhattan Community Access Corp. v. Halleck in stating that the Internet does not alter the “state-action doctrine.” Prager University’s contention in this case is ultimately similar to the many complaints against these technology platforms. These platforms rely on enforcing their Terms of Service and Community Guidelines, and as private forums they have every right to “remove content that violates its Terms of Service, or restrict otherwise objectionable videos.” As arbitrarily applied as these policies may seem, these companies are well within their rights.
  4. Federal law (Section 230 of the Communications Decency Act) affords these technology platforms “broad immunity from liability for their users’ actions, as well as wide latitude to police content,” writes The Wall Street Journal. President Trump and Congressional Republicans have long decried alleged anti-conservative bias; this discussion has prompted the possible establishment of a panel “to review complaints of…bias on social media.” The importance of this panel will depend on the solutions it proposes. Some members of Congress have proposed 1) repealing this federal immunity, 2) using antitrust law to break up these entities, and 3) creating a 21st-century version of the Fairness Doctrine.

Suppose that this federal immunity were repealed and these platforms became liable for user content, subjecting them to far more stringent policing obligations. Serious infringement of free speech then becomes much more likely, because these companies would be forced to adopt overly restrictive rules on content creation. The risk is real: trial lawyers would likely view liability as an opportunity to extract easy money from deep-pocketed technology companies.

The legal question pertaining to internet censorship is unlikely to be resolved by the courts in favor of aggrieved users. Instead, individuals, interest groups, and Congress can push for serious changes to how these platforms operate, but that assumes a broad consensus that things are broken. Americans are relying on technology platforms for their news at record rates, and those platforms have provided the kind of competition to legacy news outlets that is necessary to ensure everyone has a voice.

The Wall Street Journal’s Gerald Seib writes, “democracy was never intended to be the neatest form of government.” Social media gives everyday individuals an opportunity to express themselves, while also demonstrating how fundamentally different each individual is. Internet activist Eli Pariser writes of Internet algorithms that “democracy relies on shared facts; instead, we’re being offered parallel but separate universes.” That disturbing truth becomes apparent when there is no common agreement on what is fact and what is fiction.

Current law gives these platforms broad leeway to decide what content is objectionable and what is not, and the Constitution offers no recourse. The fundamental question for our society is what degree of content filtering we find acceptable. The market, not lawyers, will ultimately decide society’s preferred outcome for these technology platforms. Individual content creators, such as Joe Rogan, and consumers of content will steer media consumption toward platforms where the threat of censorship does not prevail.
