Today, on behalf of the Trump Administration, the Department of Justice sent draft legislation to Congress to reform Section 230 of the Communications Decency Act. The draft legislative text implements reforms that the Department of Justice deemed necessary in its June Recommendations and follows a yearlong review of the outdated statute. The legislation also executes President Trump’s directive from the Executive Order on Preventing Online Censorship.
“For too long Section 230 has provided a shield for online platforms to operate with impunity,” said Attorney General William P. Barr. “Ensuring that the internet is a safe, but also vibrant, open and competitive environment is vitally important to America. We therefore urge Congress to make these necessary reforms to Section 230 and begin to hold online platforms accountable both when they unlawfully censor speech and when they knowingly facilitate criminal activity online.”
“The Department’s proposal is an important step in reforming Section 230 to further its original goal: providing liability protection to encourage good behavior online,” said Deputy Attorney General Jeffrey A. Rosen. “The proposal makes clear that, when interactive computer services willfully distribute illegal material or moderate content in bad faith, Section 230 should not shield them from the consequences of their actions.”
The Department of Justice is grateful to all the experts, victims’ groups, academics, businesses, and other stakeholders that have engaged, and continue to engage, closely with the department during this process. The draft legislation reflects important and helpful feedback received thus far. The department is also grateful to our colleagues in Congress for their support on Section 230 reform and looks forward to continued engagement moving forward.
The Department of Justice’s draft legislation focuses on two areas of reform, both of which are, at minimum, necessary to recalibrate the outdated immunity of Section 230.
Promoting Transparency and Open Discourse
First, the draft legislation includes a series of reforms to promote transparency and open discourse and to ensure that platforms treat the public fairly when removing lawful speech from their services.
Current interpretations of Section 230 have enabled online platforms to hide behind its immunity while censoring lawful speech in bad faith and in ways inconsistent with their own terms of service. To remedy this, the department’s legislative proposal revises and clarifies the existing language of Section 230 and replaces vague terms that may be used to shield arbitrary content moderation decisions with more concrete language that gives greater guidance to platforms, users, and courts.
The legislative proposal also adds language to the definition of “information content provider” to clarify when platforms should be responsible for speech that they affirmatively and substantively contribute to or modify.
Addressing Illicit Activity Online
The second category of amendments is aimed at incentivizing platforms to address the growing amount of illicit content online, while preserving the core of Section 230’s immunity for defamation claims.
Section 230 immunity is meant to incentivize and protect online Good Samaritans. Platforms that purposely solicit and facilitate harmful criminal activity — in effect, online Bad Samaritans — should not receive the benefit of this immunity. Nor should a platform receive blanket immunity for continuing to host known criminal content on its services, despite repeated pleas from victims to take action.
The department also proposes to more clearly carve out federal civil enforcement actions from Section 230. Although federal criminal prosecutions have always been outside the scope of Section 230 immunity, online crime is a serious and growing problem, and there is no justification for blocking the federal government from civil enforcement on behalf of American citizens.
Finally, the department proposes carving out certain categories of civil claims that are far outside Section 230’s core objective, including offenses involving child sexual abuse, terrorism, and cyberstalking. These amendments, working together, will be critical first steps in enabling victims to seek redress for the most serious of online crimes.
The Justice Department’s proposals are available here.
DEPARTMENT OF JUSTICE’S REVIEW OF SECTION 230 OF THE COMMUNICATIONS DECENCY ACT OF 1996
As part of the President’s Executive Order on Preventing Online Censorship, and as a result of the Department’s longstanding review of Section 230, the Department has developed the following legislative package to reform Section 230. The proposal focuses on the two principal areas of concern highlighted by victims, businesses, and other stakeholders in the conversations and meetings the Department held to discuss the issue. First, it addresses unclear and inconsistent moderation practices that limit speech and go beyond the text of the existing statute. Second, it addresses the proliferation of illicit and harmful content online that leaves victims without any civil recourse. Taken together, the Department’s legislative package provides a clear path forward on modernizing Section 230 to encourage a safer and more open internet.
The Department identified four areas ripe for reform:
1. Incentivizing Online Platforms to Address Illicit Content
The first category of potential reforms is aimed at incentivizing platforms to address the growing amount of illicit content online, while preserving the core of Section 230’s immunity for defamation.
a. Bad Samaritan Carve-Out. First, the Department proposes denying Section 230 immunity to truly bad actors. The title of Section 230’s immunity provision—“Protection for ‘Good Samaritan’ Blocking and Screening of Offensive Material”—makes clear that Section 230 immunity is meant to incentivize and protect responsible online platforms. It therefore makes little sense to immunize from civil liability an online platform that purposefully facilitates or solicits third-party content or activity that would violate federal criminal law.
b. Carve-Outs for Child Abuse, Terrorism, and Cyber-Stalking. Second, the Department proposes exempting from immunity specific categories of claims that address particularly egregious content, including (1) child exploitation and sexual abuse, (2) terrorism, and (3) cyber-stalking. These targeted carve-outs would halt the over-expansion of Section 230 immunity and enable victims to seek civil redress in causes of action far afield from the original purpose of the statute.
c. Case-Specific Carve-Outs for Actual Knowledge or Court Judgments. Third, the Department supports reforms to make clear that Section 230 immunity does not apply in a specific case where a platform had actual knowledge or notice that the third-party content at issue violated federal criminal law or where the platform was provided with a court judgment that content is unlawful in any respect.
2. Clarifying Federal Government Enforcement Capabilities to Address Unlawful Content
A second category of reforms would increase the ability of the government to protect citizens from harmful and illicit conduct. These reforms would make clear that the immunity provided by Section 230 does not apply to civil enforcement actions brought by the federal government. Civil enforcement by the federal government is an important complement to criminal prosecution.
3. Promoting Competition
A third reform proposal is to clarify that federal antitrust claims are not covered by Section 230 immunity. Over time, the avenues for engaging in both online commerce and speech have concentrated in the hands of a few key players. It makes little sense to enable large online platforms (particularly dominant ones) to invoke Section 230 immunity in antitrust cases, where liability is based on harm to competition, not on third-party speech.
4. Promoting Open Discourse and Greater Transparency
A fourth category of potential reforms is intended to clarify the text and original purpose of the statute in order to promote free and open discourse online and encourage greater transparency between platforms and users.
a. Replace Vague Terminology in (c)(2). First, the Department supports replacing the vague catch-all “otherwise objectionable” language in Section 230(c)(2) with “unlawful” and “promotes terrorism.” This reform would focus the broad blanket immunity for content moderation decisions on the core objective of Section 230—to reduce online content harmful to children—while limiting a platform’s ability to remove content arbitrarily or in ways inconsistent with its terms of service simply by deeming it “objectionable.”
b. Provide Definition of Good Faith. Second, the Department proposes adding a statutory definition of “good faith,” which would limit immunity for content moderation decisions to those made in accordance with plain and particular terms of service and accompanied by a reasonable explanation, unless such notice would impede law enforcement or risk imminent harm to others. Clarifying the meaning of “good faith” should encourage platforms to be more transparent and accountable to their users, rather than hide behind blanket Section 230 protections.
c. Explicitly Overrule Stratton Oakmont to Avoid Moderator’s Dilemma. Third, the Department proposes clarifying that a platform’s removal of content pursuant to Section 230(c)(2) or consistent with its terms of service does not, on its own, render the platform a publisher or speaker for all other content on its service.