On October 20, 2023, the Knight Institute hosted a closed convening to explore the question of jawboning: informal government efforts to persuade, cajole, or strong-arm private platforms to change their content-moderation policies. Participants in that workshop wrote short notes to outline their thinking on this complex topic, which the Knight Institute published in the weeks leading up to and after the convening. This blog post is part of that series.

***

In some areas of technology policy, process helps to remedy problems of substance. In intellectual property and law enforcement, for instance, governments and companies have developed sophisticated processes for handling decisions, and those processes help to navigate thorny substantive questions around privacy, security, property rights, and expression.

Both areas of law are imperfect—critics claim they undermine rights and improperly privilege certain interests over others. But even if the balance they strike is fraught and tenuous, it is a balance. Could we apply what we have learned from these approaches to try to guide governments and companies toward a better approach to jawboning?

Jawboning is the government’s use of pressure to try to influence company decision-making. It’s been in the spotlight recently, with judges holding that the Biden administration’s communications with Meta about its pandemic content moderation practices were unconstitutional. A few weeks ago, the Knight First Amendment Institute convened a group of scholars and practitioners to discuss the topic, and published a series of blog posts on it.

It’s not particularly difficult to articulate why jawboning is problematic in some cases, such as when a government official threatens to file an antitrust case if a platform refuses to remove lawful speech. But it’s also easy to see why sweeping remedies—like prohibiting all government communication with platforms about issues related to content moderation, or like limiting information sharing between academic institutions, companies, and governments—will wipe out the kind of educational, informative communication that can be critical to improving both tech products and tech policy. The challenge is to develop solutions that mitigate jawboning’s most abusive forms, while preserving the benefits of company-government communication.

The need for a solution is apparent and urgent. Jawboning is an everyday practice in the relationship between government officials and tech companies. It is not a Republican issue or a Democratic issue—both parties jawbone. And it is not a uniquely American phenomenon. Governments around the world do it too.

It’s also clear that despite the recent court decisions, jawboning will continue, likely largely unabated. For instance, the recent White House executive order on artificial intelligence requires the Secretary of Homeland Security to “share information and best practices with AI developers and law enforcement personnel to identify incidents, inform stakeholders of current legal requirements, and evaluate AI systems for IP law violations.” Will this directive prod government officials to pressure companies to remove legal content, or to alter systems that are otherwise lawful? Almost certainly yes.

Courts might help to provide solutions, but judges will struggle to develop bright-line rules separating constitutional from unconstitutional conduct. The legal doctrine focuses on when government behavior tilts toward “coercion” or “significant encouragement,” rather than mere “coaxing.” Perhaps brilliant judges and legal scholars will be able to devise elegant ways to draw those lines clearly and consistently, but from a practitioner’s standpoint, line drawing seems fraught and challenging, as Katie Harbath and I argued in an essay we wrote for the Knight Institute’s convening.

The difficulty of this line drawing suggests that there’s merit in considering procedural strategies for mitigating jawboning problems, alongside whatever doctrinal solutions courts develop. A procedural approach might curb much of the same behavior as a doctrinal one, but without the inconsistencies and delays that come with case-by-case judicial adjudication.

The mundane nature of procedure could also help to counter the intensity of the political and personality dynamics that infuse the interactions between government officials and company executives. If the parties needed to follow a specific procedure to request the removal of content, then no matter how much a government employee might try to coerce, pressure, cajole, encourage, or persuade a company employee to leave certain content up or take certain content down, the answer would always be the same: “file a request.”

As Katie and I emphasize in our essay, procedures to govern jawboning could make government and companies more accountable, force government communication out of the shadows, and level power imbalances between company employees and policymakers. The suggestions we made in that essay include:

  • The Biden administration should issue an executive order outlining permissible conduct for government officials when they engage on policy issues with industry executives. This order should draw a clear line between education and influence.
  • Technology companies should establish a formal mechanism to report inappropriate jawboning.
  • The government should institute a firewall between those who engage with technology companies on national security or public health issues and those who handle regulatory and policy matters for the tech industry. 
  • Congress should pass legislation that authorizes data sharing with companies, including sharing from companies to the government and between companies and researchers. 
  • Companies should voluntarily share data, and conduct trainings on their tools, with government officials.
  • Companies should establish an intake system for governments to make requests, such as a law enforcement portal. The system should accept requests only from recognized government officials and route those requests to appropriate company decision-makers. Companies should then aggregate data on government requests and publish it regularly in transparency reports.
  • Companies should diversify their internal decision-making processes, ensuring that a range of teams—not only public policy teams, but also safety, product, and business teams—are included in decisions about how to respond to government requests.

In making these process-based proposals, Katie and I neglected to mention one key thing: the tech sector’s long history of resorting to process as a mechanism for dealing with thorny policy and legal challenges. This post looks at two examples of this tactic: intellectual property claims and law enforcement requests for user data. It outlines how these processes work, their strengths and their weaknesses, and how we might use them as models to develop more effective guardrails for jawboning.

Intellectual Property Claims

The Digital Millennium Copyright Act (DMCA) establishes a notice-and-takedown process for claims of copyright infringement. It gives platforms a safe harbor from infringement liability, but a platform that receives notice of infringement and fails to take action risks losing that protection.

To preserve their DMCA safe harbor, tech companies publish policies on their notice-and-takedown procedures, including how rightsholders can submit notices. They also outline a process for how alleged infringers can contest a notice, and may provide online forms to enable both sides to take action. Within companies, IP requests are typically handled by legal teams or by operational teams with defined escalation procedures for legal review in challenging cases.
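To make the mechanics concrete, here is a minimal sketch of a notice intake, in Python. The names and structure are hypothetical, but the required fields track the statutory elements listed in 17 U.S.C. § 512(c)(3):

```python
from dataclasses import dataclass

@dataclass
class DMCANotice:
    """A takedown notice with the elements that 17 U.S.C. § 512(c)(3) requires."""
    complainant_name: str
    contact_email: str            # contact information for the rightsholder
    work_described: str           # identification of the copyrighted work
    infringing_url: str           # location of the allegedly infringing material
    good_faith_statement: bool    # belief that the use is unauthorized
    accuracy_statement: bool      # accuracy attested, under penalty of perjury
    signature: str                # physical or electronic signature

def handle_notice(notice: DMCANotice) -> str:
    """Reject incomplete filings; otherwise queue the material for takedown."""
    required_fields = [notice.complainant_name, notice.contact_email,
                       notice.work_described, notice.infringing_url,
                       notice.signature]
    if not all(required_fields) or not (notice.good_faith_statement
                                        and notice.accuracy_statement):
        return "rejected: notice is missing a required element"
    # In practice, removal triggers notice to the uploader, who may file
    # a counter-notice contesting the claim.
    return f"queued for takedown: {notice.infringing_url}"
```

The value of the structure is that completeness can be checked mechanically: an incomplete notice bounces back without anyone having to exercise judgment.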

As noted in more detail below, the DMCA has been the target of extensive criticism. Even so, under the DMCA, rightsholders, alleged infringers, and platforms all have a clear playbook, and they typically follow it. When there are disagreements, the parties can litigate, but the overwhelming majority of disputes are handled by the notice-and-takedown regime and never end up in court.

Companies typically publish data on IP-related takedowns in their transparency reports, and many companies submit data to Lumen, a third-party organization that publishes data on takedown notices.

Law Enforcement Requests for User Data

Another example is the process that law enforcement officials use to obtain data from companies. The Electronic Communications Privacy Act (ECPA) establishes limitations on what data companies can disclose and to whom they can disclose it, and imposes liability on companies if they disclose data to law enforcement officials outside of these boundaries. By ensuring that companies face legal risk for unauthorized disclosures, ECPA has spurred a detailed, rigorous process that governments and companies use whenever law enforcement officials seek user data to aid an investigation.

Most companies require governments to submit requests through formal channels and in writing. Companies often require officials to use a law enforcement “portal,” and restrict access to the portal to officials with a law enforcement email address (see: Google, Meta, and TikTok). Government officials must include specific information in the request, including the legal basis for making it. They may be able to get expedited access to a wider range of data in an emergency, but only if they can demonstrate that the request is in fact an emergency.

On the company side, the requests are processed by a dedicated law enforcement response team, and only a limited group of people is authorized to disclose data to the government. There are usually detailed internal procedures for escalating difficult decisions, so that communications, policy, and legal teams have visibility into the decisions and are able to express their views. Companies track the requests, and many publish transparency reports with information about the number of requests they receive, the types of requests, and whether they provided data in response. Many companies also commit to notify users when they receive a request for data about them, unless they are legally prohibited from doing so.
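A rough sketch of that intake logic might look like the following. Everything here is an assumption (the domain check, the field names, the queue labels), and real portals verify identity and legal process far more rigorously, but it shows how gating, validation, escalation, and logging fit together:

```python
from dataclasses import dataclass

# Hypothetical gating rule; real portals verify identity far more carefully.
APPROVED_SUFFIXES = (".gov", ".mil")

REQUEST_LOG: list[dict] = []  # raw material for the transparency report

@dataclass
class DataRequest:
    requester_email: str
    legal_basis: str         # e.g., "subpoena", "court order", "search warrant"
    target_account: str
    is_emergency: bool = False
    emergency_justification: str = ""

def intake(request: DataRequest) -> str:
    """Gate, validate, and route a request, logging it for transparency reporting."""
    # Only recognized government addresses may file at all.
    if not request.requester_email.endswith(APPROVED_SUFFIXES):
        return "rejected: not a recognized law enforcement address"
    if not request.legal_basis:
        return "rejected: no legal basis stated"
    if request.is_emergency and not request.emergency_justification:
        return "rejected: emergency requests must include a justification"
    # Difficult cases escalate so legal, policy, and comms teams get visibility.
    queue = "expedited-review" if request.is_emergency else "standard-review"
    REQUEST_LOG.append({"basis": request.legal_basis, "queue": queue})
    return f"routed to {queue}"
```

Note the design choice: the transparency log is a byproduct of the intake itself. Because every request enters through one door, counting requests is trivial.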

A Process-Based Model for Jawboning

The tech sector’s experience with the DMCA and ECPA provides some guidance on what a process-based jawboning model might look like. Instead of exerting informal pressure on companies to change their content moderation practices, the government should use a formal procedure to submit content removal requests to platforms. This procedure should have three primary features.

First, we should limit the communication channels. The White House could issue an executive order requiring government officials to use formal communication channels to submit content removal or restoration requests to platforms. The order could also specify which executive branch officials will have access to the channel.

Companies could create portals to receive these requests, and limit access to officials using a government email address. On the company side, a limited set of employees could receive and process the requests and use escalation procedures to route complex decisions to a broader set of relevant company officials, including members of the policy and legal teams.
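Assuming the executive order produced a roster of authorized officials, the portal’s core rule could be as simple as the sketch below (all names hypothetical):

```python
from dataclasses import dataclass

# Hypothetical roster: the officials the executive order authorizes
# to submit content requests through the formal channel.
AUTHORIZED_OFFICIALS = {"press.office@who.example.gov", "liaison@dhs.example.gov"}

@dataclass
class ContentRequest:
    official_email: str
    content_url: str
    action: str            # "remove" or "restore"
    stated_rationale: str

def receive(request: ContentRequest) -> str:
    """Accept requests only from authorized officials, only via this channel."""
    if request.official_email not in AUTHORIZED_OFFICIALS:
        # The response to out-of-channel pressure is always the same.
        return "rejected: please file a request through the portal"
    if request.action not in {"remove", "restore"}:
        return "rejected: unrecognized action"
    # A small intake team handles routine requests; complex ones escalate
    # to a broader group that includes the policy and legal teams.
    team = "escalation-queue" if is_complex(request) else "intake-team"
    return f"logged and routed to {team}; any action taken is voluntary"

def is_complex(request: ContentRequest) -> bool:
    # Toy heuristic standing in for real escalation criteria.
    return request.action == "restore" or len(request.stated_rationale) > 500
```

Nothing in this flow obligates the company to act. The portal records and routes the request; the company’s own policies govern the outcome.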

Second, we should create legal liability for action taken outside of this process. It may not be possible to impose liability in the same way the DMCA and ECPA do, since the First Amendment limits the government’s ability to restrict platforms’ speech decisions. Even so, if an executive order specified which government officials could ask platforms to remove or restore content and required those requests to be submitted through a formal channel, then any request made outside that channel, or by an unauthorized official, would be unlawful. To keep this process consistent with the First Amendment, and to prevent it from becoming a mechanism for government control of platform speech policy, the executive order could also make clear that company responses to government content requests are voluntary. Companies’ terms of service would govern their content decisions, not the government’s determination of whether speech is “good” or “bad.” Companies would not be expected to acquiesce to all government requests.

Even where the First Amendment prevents the government from imposing such constraints, companies could impose them on themselves, tying their own hands by including details of these jawboning procedures in their terms of service. Then, if they took action that deviated from those commitments, they would expose themselves to liability under Section 5 of the FTC Act.

Third, companies and the government should commit to be transparent about jawboning. Currently, companies may struggle to track all removal requests, since so many of them happen informally and since so many company staffers field them. Establishing a formal communication channel would make it easier for companies to track the number of requests they receive and the action they take on each one. They could then publish the data in their transparency reports. Companies could also commit to provide notice to users when the government requests to remove their content.

In addition to company reports, the government could track data as well, and publish regular reports on the total number of requests it submits across platforms. Company-by-company transparency provides only a piecemeal view of government requests, since it tells the story only of companies that report and excludes all that don’t. A government report could paint a more comprehensive picture of government efforts to influence platform content.
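Once requests flow through a single channel, the reporting itself becomes mechanical. A toy sketch, with invented log entries, shows how either side could roll raw intake logs up into the counts a transparency report would publish:

```python
from collections import Counter

# Invented log entries; in practice each entry would come from the intake system.
requests = [
    {"agency": "DHS", "platform": "PlatformA", "action": "remove"},
    {"agency": "DHS", "platform": "PlatformB", "action": "remove"},
    {"agency": "HHS", "platform": "PlatformA", "action": "restore"},
]

def summarize(entries: list[dict]) -> dict:
    """Roll raw request logs up into the counts a transparency report publishes."""
    return {
        "total": len(entries),
        "by_agency": dict(Counter(e["agency"] for e in entries)),
        "by_platform": dict(Counter(e["platform"] for e in entries)),
        "by_action": dict(Counter(e["action"] for e in entries)),
    }

print(summarize(requests))
# {'total': 3, 'by_agency': {'DHS': 2, 'HHS': 1}, ...}
```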

A benefit of establishing a formal communication channel for government content requests is that it could be reserved solely for those requests, leaving the door open to the kinds of educational, informative interactions that are so important to preserve. If a member of Congress wanted to relay something she learned about how online content had affected a constituent, she could do that however she wanted: by reaching out to an executive at the company, by giving a speech, or by publishing an op-ed. But as soon as the nature of that communication shifted from relaying information to requesting action, the company’s hands would be tied. It could not take action unless it received the request through the content request portal.

Yes, But …

A process-based solution offers many benefits. It brings shadowy behavior into the light, it constrains decision-making by government and corporate actors, and it reduces power imbalances.

But as we have learned with ECPA and the DMCA, process-based solutions can also be woefully insufficient and frustrating. ECPA has received extensive criticism for making it too difficult for companies to respond to foreign government requests, even when a company has data that might help the government in a priority law enforcement investigation. It has also been criticized for exactly the opposite reason: it sometimes fails to protect users from data disclosures that compromise their privacy. Similarly, as noted above, the DMCA has been the target of criticism both by rightsholders (who claim it does not protect their creative works from infringement) and by users and rights groups (who claim that it enables large media companies to squash creativity, expression, and innovation).

Another weakness of this approach is that it might address only a small portion of the jawboning problem. Governments exert pressure to try to change specific decisions—like whether to allow one individual post to remain on a platform—but they also exert pressure to try to change platform policies and procedures. A process that is limited to individual requests to remove or restore content will be unlikely to alleviate pressure related to more systemic platform practices.

A process-based solution may have deeper flaws as well. So-called “internet referral units” have sprung up in several countries, using dedicated teams to identify allegedly problematic content and then pressure companies to remove it. While these public referral units may be preferable to behind-the-scenes jawboning, Stanford professor Daphne Keller has noted “what is lost by turning toward these ‘voluntary’ mechanisms: legislation as a source of rules, and courts as a source of adjudication, for online speech.” Even if process is imposed as a way to guide and constrain government and company actors, a new channel for requests sidesteps the formal democratic channels that should govern decisions about online content.

Looking Ahead

We might prefer that formal democratic processes like statutes, regulations, and court decisions govern online speech, but that goal is probably a fantasy whether or not we implement a process-based solution to jawboning. And in the absence of a process-based solution, jawboning will continue in the shadows.

For that reason, the imperfections of a process-based solution shouldn’t be fatal. Both ECPA and the DMCA are flawed, yet they handle most cases reasonably well most of the time. They systemize company and government decision-making by imposing constraints that reduce the likelihood of ad hoc decisions motivated by power, politics, and personalities.

Neither does a perfect job of adjudicating the rights it seeks to balance, and hard cases can make it seem like the statutes are teetering on the edge of collapse. Both are limited to individual requests, so they are rarely responsive to broader concerns about government or platform practices. And for people concerned that IP law strikes the wrong balance between expression and property rights, or that privacy law strikes the wrong balance between privacy and national security, these processes might seem to entrench and exacerbate those concerns.

But these process-based approaches hold up complex, fraught ecosystems, and even if they’re held together with duct tape, they’re held together. Few people would describe themselves as cheerleaders for the DMCA, yet it has governed digital copyright disputes for more than two decades. Similarly, a process-based system for submitting requests that prompt company action could provide a path for muddling through.

It’s hard to envision an alternate universe without ECPA and the DMCA, where we rely solely on judge-made common law to provide the rules to govern intellectual property and law enforcement disclosures, and government officials try to fill gaps in that law by cajoling company executives behind the scenes to influence decisions. But it seems likely that world would be fraught with uncertainty, with occasional judicial rulings providing limited guidance, and platforms and governments left largely to guess at what the right decisions might be. Without formal democratic norms to govern decisions, power would shape outcomes. To put it simply, that world sounds a lot like the jawbone-filled world we have today.