President Trump Issues Executive Order Kicking Off Broad Government Review of Private Companies’ Activities
The President took sweeping action to review and regulate the practices of online platforms to address concerns about perceived bias by these private entities in moderating third-party content. The Executive Order (EO), Preventing Online Censorship, says that “we cannot allow a limited number of online platforms to hand pick the speech that Americans may access and convey on the internet.” It seeks to apply “the ideals of the First Amendment to modern communications technology.”
The EO raises significant questions about online expression and the extent to which the government may legally regulate or oversee the decisions made by private companies about their content platforms, and it will undoubtedly spark battles to be fought in proceedings before federal agencies, in some states, and in the courts. All participants in online commerce and communication should consider how their interests may be affected by these new initiatives, which may have broad and long-term legal and policy effects.
The EO is directed at the Federal Communications Commission (FCC), the Federal Trade Commission (FTC), the National Telecommunications and Information Administration (NTIA) within the Department of Commerce, the United States Attorney General and the Department of Justice (DOJ), and interested state Attorneys General. It calls for a review of the federal liability protections granted to providers of online services such as social media by Section 230 of the Communications Decency Act, and for scrutiny of federal spending on such services. It also seeks to investigate, and perhaps punish, online content curation and to enforce what appear to be new standards applying the First Amendment to private actors.
Certainly, the legal theories in this EO are novel and in tension with recent court decisions, which have routinely held that a digital company’s decisions about what content to present or how to present it do not transform the digital platform into the “speaker” of user content. Moreover, as the D.C. Circuit recently observed, “[i]n general, the First Amendment ‘prohibits only governmental abridgment of speech,’” and “a private entity who provides a forum for speech is not transformed by that fact alone into a state actor.” Freedom Watch, Inc. v. Google, Inc. (D.C. Cir. May 27, 2020). The administration’s claims to be advancing First Amendment principles through the EO with respect to social media platforms stand in contrast to President Trump’s attempts to block his critics on Twitter, which the U.S. Court of Appeals for the Second Circuit unanimously found to be unconstitutional government censorship based on viewpoint. Knight First Amendment Institute at Columbia University v. Trump, 928 F.3d 226 (2d Cir. 2019).
Executive Orders are notoriously hard to challenge in court, though some parties may immediately file suit. The activities set in motion by the EO are likely to present opportunities for public comment at several agencies, and possibly further judicial review. Unlike DOJ and NTIA, the FCC and FTC are independent agencies and will have more options in deciding a path forward, including not acting at all. Notably, in response to a prior White House attempt to draft an executive order targeting social media companies, the FCC and FTC privately pushed back on being deputized to police speech.
The Focus of the EO Is on Content Decisions by Online Platforms and Social Media
The EO asserts that “[o]nline platforms . . . are engaging in selective censorship that is harming our national discourse.” It cites reports of “online platforms ‘flagging’ content as inappropriate, even though it does not violate any stated terms of service; making unannounced and unexplained changes to company policies that have the effect of disfavoring certain viewpoints; and deleting content and entire accounts with no warning, no rationale, and no recourse.”
The EO leaves no doubt that it is motivated in large part by Twitter’s decision to add a fact check to certain tweets by the President, noting:
Twitter now selectively decides to place a warning label on certain tweets in a manner that clearly reflects political bias. As has been reported, Twitter seems never to have placed such a label on another politician’s tweet. As recently as last week, Representative Adam Schiff was continuing to mislead his followers by peddling the long-disproved Russian Collusion Hoax, and Twitter did not flag those tweets. Unsurprisingly, its officer in charge of so-called ‘Site Integrity’ has flaunted his political bias in his own tweets.
“Online platform” is defined broadly as “any website or application that allows users to create and share content or engage in social networking, or any general search engine.”
The EO states that “[a]t the same time online platforms are invoking inconsistent, irrational, and groundless justifications to censor or otherwise restrict Americans’ speech here at home, several online platforms are profiting from and promoting the aggression and disinformation spread by foreign governments like China.” The EO points to specific actions by online service providers, including accepting advertising from the Chinese government, establishing research partnerships in China, and allegedly amplifying propaganda abroad.
The EO further states that “[w]e must seek transparency and accountability from online platforms, and encourage standards and tools to protect and preserve the integrity and openness of American discourse and freedom of expression.” It comes after months of activity to increase scrutiny on platforms, including by DOJ, as we have previously described.
Section 2 Addresses Section 230 and Directs NTIA to File a Petition with the FCC
To achieve this desired “transparency and accountability,” the EO takes aim at Section 230, a provision of the 1996 Communications Decency Act adopted to allow providers of online services such as social media (providers of “interactive computer services,” under the law) to moderate the third-party content posted on their platforms without becoming responsible for that content. In so doing, the EO adopts a novel interpretation of Section 230 that would impose additional obligations on interactive computer service providers that seek to take advantage of the liability protection afforded by the statute. The EO says that all agencies will follow this position, which may create dissonance in agencies with independent litigating authority. And, acting through NTIA, the EO seeks to have the FCC adopt new regulations consistent with its interpretation.
Section 230 protects interactive computer service providers such as social media platforms in two ways. Under Section 230(c)(1), “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This means that interactive computer service providers are free to host third-party content without being liable for the substance of that content. Section 230(c)(2) establishes an additional liability shield for users and providers of interactive computer services who moderate their content, immunizing them from suit for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene . . . or otherwise objectionable.” As the Ninth Circuit has explained, these two subsections provide independent protections for providers: “even those who cannot take advantage of subsection (c)(1), perhaps because they developed, even in part, the content at issue . . . can take advantage of subsection (c)(2) if they act to restrict access to the content because they consider it obscene or otherwise objectionable.” Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1105 (9th Cir. 2009).
The EO appears to promote the opposite view, seeking a “clarifi[cation]” of Section 230 immunity that could tie these provisions together by making “good faith” a prerequisite for protection under either (c)(1) or (c)(2), and that would impose an unprecedentedly rigorous interpretation of what “good faith” means.
First, the EO states that the immunity of Section 230(c) “should not extend beyond its text and purpose to provide protection for those who purport to provide users a forum for free and open speech, but in reality use their power over a vital means of communication to engage in deceptive or pretextual actions stifling free and open debate by censoring certain viewpoints.” Second, although the EO asserts that not acting in “good faith” pursuant to subsection (c)(2) only prevents an interactive computer service provider from using the shield supplied by subsection (c)(2), it directs the Secretary of Commerce (Secretary), acting through NTIA, to file a petition for rulemaking with the FCC within 30 days asking the FCC to propose regulations to “clarify” the “interaction” between subsections (c)(1) and (c)(2), with an eye toward making the “good faith” provision applicable to both. Specifically, the FCC is to propose regulations to clarify:
(i) the interaction between subparagraphs (c)(1) and (c)(2) of section 230, in particular, to clarify and determine the circumstances under which a provider of an interactive computer service that restricts access to content in a manner not specifically protected by subparagraph (c)(2)(A) may also not be able to claim protection under subparagraph (c)(1), which merely states that a provider shall not be treated as a publisher or speaker for making third-party content available and does not address the provider’s responsibility for its own editorial decisions
Notably, a draft version of the Executive Order released last night went further than asking the FCC to consider the interaction, and flatly stated that when a provider “mak[es] itself an editor of content outside the protections of subparagraph (c)(2)(A), such a provider forfeits any protection from being deemed a ‘publisher or speaker’ under subsection 230(c)(1), which properly applies only to a provider of content supplied by others.” (emphasis added).
The EO further directs the NTIA to file a petition for rulemaking to seek FCC clarification on:
(ii) the conditions under which an action restricting access to or availability of material is not “taken in good faith” within the meaning of subparagraph (c)(2)(A) of section 230, particularly whether actions can be “taken in good faith” if they are:
(1) deceptive, pretextual, or inconsistent with a provider’s terms of service; or
(2) taken after failing to provide adequate notice, reasoned explanation, or a meaningful opportunity to be heard; and
(iii) Any other proposed regulations that the NTIA concludes may be appropriate to advance the policy described in subsection (a) of this section.
This proposed “clarification” would add, in essence, a due process requirement to decisions made by private actors to limit access to obscene, lewd, or otherwise objectionable speech. It would also expose those decisions to lengthy and potentially fact-intensive litigation, which is precisely the risk that Section 230 was enacted to avoid.
The FCC is under no obligation to act on an NTIA petition and, as an independent agency, will be expected to base its considerations on its statutory mandates and sound public policy. The FCC has not thus far been engaged in the debates over Section 230, which has been under scrutiny and could be amended by Congress, as proposed in the EARN IT Act. We have previously analyzed proposals to change the approach to Section 230.
Responding to the NTIA’s request to create rules to assess the “good faith” of platforms’ actions and judgments could pose a challenge for the independent agency. The FCC earlier this year rejected a petition seeking a government investigation into broadcasters’ coverage of the President, stating that the agency is not “a roving arbiter of broadcasters’ editorial judgments.” The FCC’s response to the NTIA request will be important and complex; so far, Chairman Pai has indicated only that “[t]his debate is an important one. The Federal Communications Commission will carefully review any petition for rulemaking filed by the Department of Commerce.”
Section 3 Directs Agencies to Review Their Online Platform Advertising Spending
The EO attempts to use the power of the purse, targeting the use of government funds on online advertising. Each executive department and agency has 30 days to review its marketing spending with online platforms and report to the Office of Management and Budget (OMB), identifying the amount of money spent and the online platforms supported. The DOJ will “review the viewpoint-based speech restrictions imposed by each online platform identified in the report described in subsection (b) of this section and assess whether any online platforms are problematic vehicles for government speech due to viewpoint discrimination, deception to consumers, or other bad practices.”
It is not clear what criteria DOJ will use to assess the alleged speech restrictions or whether an online platform is a “problematic vehicle” for government speech.
Section 4 Kicks Off Major Activity at the Federal Trade Commission
The EO states that “[i]t is the policy of the United States that large online platforms, such as Twitter and Facebook, as the critical means of promoting the free flow of speech and ideas today, should not restrict protected speech.” Asserting that these social media sites provide an “important forum” to the public for others to engage in free expression and debate, the EO cites PruneYard Shopping Center v. Robins, 447 U.S. 74, 85-89 (1980), a controversial case that has not been much extended or applied, which held that a shopping center could be required to host speech with which it disagreed because it had opened its property to some speakers.
The EO states that complaints received through the White House Tech Bias Reporting Tool will be submitted to DOJ and the FTC.
Further, the EO directs the FTC to “consider taking action . . . to prohibit unfair or deceptive acts or practices in or affecting commerce” and states that unfair or deceptive acts or practices “may include practices by entities covered by section 230 that restrict speech in ways that do not align with those entities’ public representations about those practices.”
In a particularly aggressive move, the EO directs the FTC to consider whether complaints against “large online platforms,” including Twitter, “allege violations of law that implicate” the EO’s newly announced principle, applying First Amendment norms to private actors, that “large online platforms” “should not restrict protected speech.” It directs the FTC to “consider developing a report describing such complaints and making the report publicly available.”
As an independent agency, the FTC will be expected to bring its own judgment and statutory authorities to bear; those authorities cannot be expanded by executive order. The FTC may decline to adopt the newly announced principle that undefined “large online platforms” should not “restrict protected speech.” Indeed, the FTC may struggle to define what constitutes “protected speech” in this context; the agency has faced First Amendment challenges in far less charged settings, making this task particularly complex. The FTC often holds workshops and seeks public comments on high-profile issues and may seek further stakeholder input on how it should approach the issues outlined in the EO.
Section 5 Engages State Attorneys General to Deploy State Law to Reinforce the EO’s Mission, and to Draft Model Legislation, at the Direction of the Attorney General
The United States Attorney General is directed to establish a working group to look at the potential use of state unfair and deceptive acts and practices laws to go after online platforms, and to invite State Attorneys General for discussion and consultation.
This working group will be given the complaints submitted to the White House Tech Bias Reporting Tool and is encouraged to consider publicly available information about online platforms’ “increased scrutiny of users based on other users they choose to follow, or their interactions with other users.” The EO also directs scrutiny of algorithms, of restrictions on “the ability of users with particular viewpoints to earn money on the platform,” and of other issues.
Notably, the EO directs the working group to “develop model legislation for consideration by legislatures in States where existing statutes do not protect Americans from such unfair and deceptive acts and practices,” which represents a real challenge to ongoing efforts to create a uniform and predictable federal approach to online privacy and commerce.
Section 6 Calls for Draft Legislation from the Attorney General
The EO directs the Attorney General to “develop a proposal for Federal legislation that would be useful to promote the policy objectives of this order.” This broad command increases the role of the Attorney General and DOJ in ongoing debates over Section 230 and related issues.
Wiley’s TMT Practice has championed First Amendment rights for decades in regulatory proceedings, in enforcement actions, and in court. We represent diverse companies and industries before each of the agencies addressed in the EO. We have worked with Section 230 since before its enactment. Our team is actively engaged on these issues and available to help organizations consider how these developments may affect their operations.
If you need assistance analyzing the EO or its likely impacts on federal agencies, please reach out to one of your Wiley contacts.
View the redline of the draft EO here.