
Section 230: A Focus of Criticism, Reform Proposals

In Short

The Situation: Section 230 of the
Communications Decency Act of 1996 gives technology companies
immunity from liability for content hosted on their networks while
recognizing their ability to restrict content they deem
inappropriate.

The Result: As lawmakers weigh the lessons of recent elections and the role of technology companies in transmitting political speech and alleged misinformation, Section 230 is the subject of intense criticism and possible reform in Washington, D.C.

Looking Ahead: Washington lawmakers are
pressing Section 230 reform proposals that could impact the
liability protections afforded to technology companies, while
seeking to preserve the ability of those companies to host the open
and free-flowing discussions that are essential to the democratic
process.

Section 230

Section 230 gives technology companies two important liability protections.

First, they cannot be held liable for content that a third party
posts on their platforms, effectively barring suits against
technology companies for defamation, negligence, and violations of
state antidiscrimination laws.

Second, they cannot be held liable for filtering or restricting
obscene or otherwise objectionable material, nor can they be held
liable when they choose not to filter or restrict this
material.

These two provisions shield technology companies from lawsuits
seeking to hold them liable for content hosted on their networks.
They are traditionally seen as instrumental to the rise of a free and open internet and a vibrant online economy and media environment in the United States in the quarter century since Section 230’s enactment. Yet now the framework established by
Section 230 is coming under new and sustained criticism, including
from emerging left–right coalitions of lawmakers and policy
analysts.

Criticisms

Section 230 has been criticized for two main reasons.

First, critics argue that, even though Section 230 permits
technology companies to restrict harmful content such as child
pornography, hate speech, misinformation, and terrorist activity,
the liability shield provided by Section 230 fails to incentivize
technology companies to adequately carry out these functions.

Second, critics argue that, by permitting technology companies to censor “violent, harassing, or otherwise objectionable” material, Section 230 gives those companies a pretext to suppress unpopular political views on the internet.

Last year, Supreme Court Justice Thomas joined the chorus of
critics, observing that “many courts have construed the law
broadly to confer sweeping immunity on some of the largest
companies in the world,” and calling on the Supreme Court to
“consider whether the text of this increasingly important
statute aligns with the current state of immunity enjoyed by
Internet platforms.” More recently, he noted how
“applying old doctrines to new digital platforms is rarely
straightforward” and questioned whether the power to regulate
speech on private digital platforms “could lawfully be
modified.”

Reform Efforts

Washington lawmakers have translated these criticisms into several legislative proposals aimed at “fixing” Section 230 while seeking to preserve its underlying goal of promoting robust debate and diverse viewpoints on the internet.

For example, Senators Schatz (D-HI) and Thune (R-SD) introduced
the PACT Act to require platforms to issue transparency reports and
remove content within four days of a court determination that the
content is illegal.

More recently, Senators Warner (D-VA), Hirono (D-HI), and
Klobuchar (D-MN) introduced the SAFE TECH Act to limit the scope of
immunity for paid content and permit actions under civil rights
laws; antitrust laws; stalking, harassment, or intimidation laws;
human rights law; or civil wrongful death statutes.

