Artificial Intelligence (AI) in Federal Election Campaigns: Legal Background and Constitutional Considerations for Legislation

Updated August 17, 2023 (IF12468)

Introduction

Federal campaign finance law does not specifically regulate the use of artificial intelligence (AI) in political campaign advertising. As technology continues to evolve, concerns have grown regarding the use of AI-generated campaign ads and their potential to spread misinformation. At the same time, there are questions about whether regulation of such ads would run afoul of the First Amendment. This CRS In Focus discusses provisions of federal campaign finance law that may be relevant should Congress consider regulating AI-generated campaign ads. It then discusses pivotal Supreme Court rulings on campaign finance law and constitutional considerations for possible legislation. For a related policy discussion, see CRS product, Artificial Intelligence (AI) and Campaign Finance Policy: Recent Developments, by R. Sam Garrett.

Federal Campaign Finance Law

The Federal Election Campaign Act (FECA or Act), codified at 52 U.S.C. §§ 30101–30146, does not specifically regulate the use of AI in political campaign ads. Two FECA provisions, however, may be relevant to this issue: the prohibition on fraudulent misrepresentation of campaign authority and the requirement of disclaimers, which are statements of attribution that appear directly on certain campaign communications.

FECA Prohibition on Fraudulent Misrepresentation of Campaign Authority

FECA prohibits candidates for federal office, and their employees and agents, from fraudulently misrepresenting themselves as speaking, writing, or otherwise acting for or on behalf of another candidate or political party "on a matter which is damaging to such other candidate or political party." The Act further prohibits any person from fraudulently soliciting campaign contributions by misrepresenting that he or she is fundraising on behalf of a candidate or political party. 52 U.S.C. § 30124.

On August 16, 2023, the Federal Election Commission (FEC) published a petition for rulemaking to amend its regulation on fraudulent misrepresentation of campaign authority, 11 C.F.R. § 110.16, to clarify that the related statute, 52 U.S.C. § 30124, applies if "candidates or their agents fraudulently misrepresent other candidates or political parties through deliberately false AI-generated content in campaign ads or other communications." The Commission had voted on August 10, 2023, to publish the petition for public comment. (On June 22, 2023, the FEC had discussed, but did not approve, a similar petition.)

FECA Disclaimer Requirements

FECA requires that any public political advertising financed by a political committee—including candidate committees—include disclaimers. FECA and Supreme Court precedent define political committee to include "any committee ... or other group of persons that receives contributions or makes expenditures aggregating in excess of $1,000 during a calendar year" whose major purpose is to elect federal candidates to office. 52 U.S.C. § 30101(4); see Buckley v. Valeo, 424 U.S. 1, 79 (1976). FECA further defines contribution and expenditure as monies or anything of value "for the purpose of influencing any election for Federal office." 52 U.S.C. § 30101(8), (9).

For radio and television advertisements by candidate committees, FECA generally requires that the communication state who financed the ad, along with an audio statement by the candidate identifying the candidate and stating that the candidate "has approved" the message. In the case of television ads, the candidate statement is also required to be conveyed by an unobscured, full-screen view of the candidate making the statement or, if the candidate message is conveyed by voice-over, accompanied by a clearly identifiable image of the candidate, along with a written message of attribution at the end of the communication. 52 U.S.C. § 30120(a).

In addition, regardless of the financing source, FECA requires a disclaimer on (1) communications that expressly advocate the election or defeat of a clearly identified candidate, (2) electioneering communications (defined to include broadcast ads that refer to a clearly identified federal candidate and run within 60 days of a general election or 30 days of a primary), and (3) public communications that solicit contributions. These communications can include ads financed by outside groups, corporations, or labor unions. For such ads, FECA generally requires that a disclaimer clearly state certain contact information of the entity that paid for the communication and that the communication was not authorized by any candidate or candidate committee. In radio and television advertisements, such disclaimers are required to include, in a clearly spoken manner, an audio statement identifying who is responsible for the content of the advertising. In television ads, the statement is required to be conveyed by an unobscured, full-screen view of a representative of the entity paying for the ad, or by such a representative in a voice-over, along with a written message of attribution at the end of the communication. 52 U.S.C. § 30120(a), (c), (d).

Effective March 1, 2023, the FEC promulgated new regulations that broaden the disclaimer requirements for public internet communications. Previously, the regulations generally required disclaimers on public communications—defined to include ads that are "placed for a fee on another person's website"—that were made by political committees, contained express advocacy, or solicited campaign contributions. The new regulations specify that this requirement also applies to "communications placed for a fee on another person's ... digital device, application, or advertising platform." 87 Fed. Reg. 77467–77480 (Dec. 19, 2022).

Regardless of whether a campaign communication is created with AI, FECA's disclaimer requirements would apply as discussed. However, the Act does not require such disclaimers to indicate that the ad was created with AI.

FECA Penalties

In addition to a series of civil penalties, FECA sets forth criminal penalties for knowing and willful violations of the Act. Generally, FECA provides that any person who knowingly and willfully violates any provision of the Act that involves the making, receiving, or reporting of any contribution, donation, or expenditure of $25,000 or more per calendar year shall be fined under Title 18 of the U.S. Code, imprisoned for not more than five years, or both. If the amount involved is $2,000 or more per calendar year, but less than $25,000, the Act provides for a fine or imprisonment for not more than one year, or both. Should Congress amend FECA to regulate AI-generated campaign ads, unless otherwise provided in the legislation, FECA's civil and criminal penalties would apply.

Constitutional Considerations for Legislation

In the 118th Congress, legislation has been introduced that would regulate AI in federal election campaigns. For example, H.R. 3044 and S. 1596, which are companion bills, would amend FECA's disclaimer requirements to require additional disclaimers. Specifically, for an ad that contains an image or video generated, entirely or in part, by AI, the legislation would require the ad to include a statement indicating that fact.

Should Congress consider legislation amending FECA to establish an AI disclaimer requirement, the Supreme Court's campaign finance jurisprudence may be relevant in evaluating the constitutional bounds of such legislation. For example, the Court upheld the facial validity of FECA's disclaimer requirements against a First Amendment challenge, determining that the disclaimer requirements "bear[] a sufficient relationship to the important governmental interest of 'shedding the light of publicity on campaign financing.'" McConnell v. FEC, 540 U.S. 93, 231 (2003). Similarly, the Court upheld FECA's disclaimer requirements as applied to a film regarding a presidential candidate and related promotional broadcast ads. Quoting Buckley and McConnell, the Court in Citizens United determined that while disclaimer requirements may burden the ability to speak under the First Amendment, they "impose no ceiling on campaign-related activities" and "do not prevent anyone from speaking." According to the Court, FECA's disclaimer requirements "provid[e] the electorate with information" and "insure that the voters are fully informed" about who is speaking. Moreover, the Court observed, disclaimers help listeners and viewers judge more effectively the arguments they are hearing. Citizens United v. FEC, 558 U.S. 310, 368 (2010).

In McConnell and Citizens United, the Court applied a standard of "exacting scrutiny," which requires a substantial relation between the disclaimer requirement and a sufficiently important governmental interest. These precedents suggest that courts could uphold the constitutionality of a FECA AI-disclaimer requirement to the extent the government could show that the requirement furthers the informational interests of the electorate. However, it is uncertain whether courts would determine that notifying the electorate that an ad was created with AI serves as sufficiently important a governmental interest as informing the electorate about who financed or approved an ad, as the current FECA disclaimer requirements mandate.

Exacting scrutiny also requires a court to evaluate the burden on speech. Because the Court appeared to rely on the fact that FECA's current disclaimer requirements do not prevent anyone from speaking, an AI disclaimer requirement that is so burdensome that it impedes the ability of a candidate or group to speak (for example, one whose required statement occupies a relatively long period of time in an ad) could violate the First Amendment. Citizens United v. FEC, 558 U.S. at 366–71.

Possibly casting further doubt on the constitutionality of an AI disclaimer requirement, the Court recently invalidated a state disclosure law under a potentially more rigorous formulation of exacting scrutiny that requires "narrow tailoring" to a sufficiently important governmental interest. Americans for Prosperity Foundation v. Bonta, 141 S. Ct. 2373, 2389 (2021). While Bonta is not a campaign finance case, some lower courts have since applied this version of exacting scrutiny in cases challenging campaign disclaimer laws. In evaluating an AI disclaimer requirement under this potentially more rigorous standard, a court might be less likely to uphold the law. Nonetheless, some appellate courts have upheld campaign finance disclaimer laws under this narrow tailoring standard. See No on E v. Chiu, 62 F.4th 529, 533 (9th Cir. 2023); Gaspee Project v. Mederos, 13 F.4th 79, 95–96 (1st Cir. 2021), cert. denied, 142 S. Ct. 2647 (2022).

In contrast to a disclaimer requirement, it appears that courts would likely determine that a prohibition on AI-generated campaign ads is unconstitutional under the First Amendment. In evaluating prohibitions on certain campaign communications, the Supreme Court has applied a "strict scrutiny" standard of review. Strict scrutiny requires the government to show that the law is the least restrictive means to achieve a compelling interest, which is a difficult standard to meet. Applying strict scrutiny, the Court invalidated a FECA provision that prohibited corporations and unions from directly funding independent expenditures and electioneering communications. Citizens United v. FEC, 558 U.S. at 372. Accordingly, it appears that a prohibition on AI-generated campaign ads would likely be invalidated under strict scrutiny unless the government could show that the prohibition is the least restrictive means of achieving a compelling governmental interest.