FTC Seeks Comment on Tech Content Moderation Policies
Federal Trade Commission (FTC) staff announced a Request for Information (RFI) on February 20 seeking public comments on “how consumers may have been harmed by technology platforms that limited their ability to share ideas or affiliations freely and openly.” The RFI signals the new Administration’s focus on tech platform practices and is likely a prelude to further investigation efforts by the FTC. It also relates to a core concern for the Trump administration: addressing perceived restrictions on free speech.
The RFI asks the public to comment on six overarching issues, which suggest potential theories of liability the FTC may be considering:
- The circumstances under which technology platforms denied or degraded users’ access to services;
- The platforms’ public-facing policies about how they moderated content;
- Whether users could challenge denials or degradation of service;
- The impact of the platforms’ content moderation activities on users;
- The factors motivating the platforms’ adoption and enforcement of content moderation policies; and
- The extent to which platforms’ content moderation actions were made possible by a lack of competition or had an impact on competition.
The RFI does not announce any investigation or enforcement approach, but it seeks information that could potentially justify such an investigation or bolster one already underway. The RFI specifically notes that “[c]omments submitted in response to this RFI could inform the FTC’s enforcement priorities and future actions.”
For example, one RFI sub-topic appears to seek evidence of potentially deceptive conduct by asking for comments on whether platforms “adhere[d] to [their] policies or other public-facing representations.” Another sub-topic pulls language from the FTC statute prohibiting unfair business practices to ask whether any “countervailing benefits to consumers or competition” justified decisions to degrade or deny users access to services. Several sub-topics also address FTC competition issues, including whether platforms engaged in possible anticompetitive conduct by coordinating their policies with competitors, and the role of platforms’ market power in shaping their use of content moderation policies.
What can we expect next from the FTC on these issues?
Chairman Ferguson has frequently expressed strong interest in tech companies’ content moderation practices. He has previously addressed this issue in statements accompanying an industry study, an enforcement action, and the FTC’s budget for 2025, in which he said the FTC should “investigate whether social media companies ‘knowingly violated their terms of services when they deplatformed customers.’”
Typically, the FTC commences an investigation by issuing a civil investigative demand (CID) to the target of the investigation and/or to third parties that may hold relevant documents. As the FTC moves forward, it is unclear whether any existing FTC “resolutions of authority” would allow the FTC to issue CIDs covering the matters raised in the RFI; if not, the Commission would have to vote to approve a new resolution.
The FTC could also issue compulsory process orders using its authority under Section 6(b) of the FTC Act to study industries and issue reports to Congress and the public. Chairman Ferguson previously described undertaking these industry studies as one of the FTC’s “most important duties.”
What can technology platforms do now?
Online platforms and other tech companies should consider approaches to engagement and evaluate their own content moderation practices in light of the questions raised, as well as their First Amendment rights and protections under Section 230 of the Communications Decency Act (CDA). It is also wise to consider these issues in concert with the approaches other regulatory agencies, such as the Federal Communications Commission (FCC), are taking to Section 230.
Technology platforms within the scope of this RFI also may want to begin considering how they would respond to a 6(b) order or CID from the FTC about these issues. These considerations can be informed by tracking this RFI carefully, reviewing the comments, and meeting with FTC staff. Platforms should also review and routinely update their public-facing content moderation policies in light of current practices.
***
Wiley’s FTC Regulation Practice counsels clients on FTC compliance, investigations, enforcement, and rulemaking, and regularly advocates before the agency. Wiley’s Telecom, Media & Technology Practice has extensive experience with issues arising under Section 230. Contact the authors for additional information.
To stay informed on all of the Executive Orders and announcements from the Trump Administration, please visit our dedicated resource center below.