Online Age Verification (Part III): Select Constitutional Issues
August 17, 2023 (LSB11022)

This Legal Sidebar is the third installment in a three-part series on efforts to require online services to verify the ages of their users. Part I provided an overview of elements common in age verification laws. Part II discussed constitutional principles that may be relevant in assessing the constitutionality of age verification laws. This part applies the principles discussed in Part II to the elements of age verification laws discussed in Part I and offers considerations for Congress should it seek to enact federal age verification requirements.

Constitutional challenges to age verification laws may raise several issues. Many of the questions about the constitutionality of age verification laws concern whether such laws are sufficiently narrow to avoid inhibiting more speech than necessary. The degree of tailoring required may vary depending on whether a given law is content based or content neutral. In either case, a law's constitutionality would depend on several factors, including (1) the strength of the government's interest, (2) the amount of protected speech that the law directly or indirectly restricts, and (3) the availability of less speech-restrictive alternatives.

Whether a Law is Content Based or Content Neutral

As discussed in Part II, laws restricting speech are subject to different legal standards depending on whether the laws target speech based on its content. Age verification laws focused on pornography or "material that is harmful to minors" are likely content based. Whether laws targeting social media are content based may be a more challenging question. Supreme Court case law suggests that speaker-based restrictions on speech—for example, laws that target particular websites—are not per se content-based restrictions. Because social media websites can host a variety of material, a law that broadly defines social media may not regulate these sites based on the substance of material that they host. Laws with content-based exceptions—such as Utah's social media law, which includes numerous exceptions for websites that provide certain types of content—may be subject to First Amendment challenges on the basis of their exceptions even if the rest of the law is deemed content neutral.

A law that imposes requirements based on a website's target audience or user demographics may be more likely to be deemed content neutral, though this is not a foregone conclusion. California's Age-Appropriate Design Code Act (CAADC) requires age estimation for any "online service, product or feature likely to be accessed by children." In a brief seeking a preliminary injunction against enforcement of the CAADC, the tech industry group NetChoice argues that the CAADC's age verification requirements are content based because they require a different degree of certainty in estimating a user's age depending on the risks posed by a particular website. Challengers may also argue that, because the law applies only to areas of the internet "likely to be accessed by children," it draws content-based distinctions that turn on how that term is defined in the CAADC. Proponents of age verification laws might counter that determining whether material is "likely to be accessed by children" does not require reference to the material's substantive message and is therefore content neutral under the Supreme Court's decision in City of Austin.

Identifying the Government Interest

Both content-based and content-neutral laws must advance a government interest—either a "compelling" interest for a content-based law or an "important" interest for a content-neutral one. The Supreme Court has affirmed that government has an interest in the welfare of children, but it has required governments to "specifically identify an 'actual problem' in need of solving," rather than relying on an "abstract" interest in protecting children. In a case involving a state ban on the sale of violent video games to minors, the Supreme Court determined that a "predictive judgment," without proof that violent video games had a negative effect on children, could not justify government regulation. Perhaps to fortify their legislation against constitutional challenges, some officials in states with age verification laws have stressed the research-backed harms their laws address—such as negative mental health effects among teens resulting from social media use.

Impacts on Protected Speech

Whether a law is narrowly tailored depends in part on how much protected speech the law affects. Age verification laws may affect the free speech rights of at least three groups: website operators, adult users, and minor users.

Speech Rights of Website Operators

Many websites affected by age verification laws—such as social media websites—host content that is provided by website users rather than the website itself. Website operators may argue that this hosting activity is itself protected expression and that age verification laws chill that activity, both by discouraging operators from hosting particular content (such as content that the operator believes may be "harmful to minors") and by discouraging them from hosting user content altogether if the costs of implementing age verification are too high.

Whether hosting third-party content online is protected by the First Amendment's Free Speech Clause is a continuing topic of debate in federal courts. Two appellate court decisions in cases challenging state social media laws reached opposite conclusions on this question. As discussed in a separate CRS Legal Sidebar, the Eleventh Circuit held that a social media platform engaging in content moderation exercises "editorial judgment" subject to First Amendment protection. By contrast, the Fifth Circuit held that social media platforms do not exercise editorial judgment and that content moderation decisions standing alone are not protected speech.

The Supreme Court has not directly addressed this question, though in a recent decision the Court held that a website designer engages in protected speech when designing a website, even when the website incorporates third-party material. The Supreme Court's decision involved a designer engaged in custom website design work, which may distinguish the designer from an operator (like a social media platform) that provides an "ordinary commercial good" on the same terms to all users. The two cases challenging state social media laws have both been appealed to the Supreme Court, so the Court may choose to address the issue in more detail.

Speech Rights of Adults

Much of the material targeted by age verification laws is protected speech when accessed by adults. With respect to pornography, sexual content that depicts adults but is not legally obscene is protected as to adults, even if it might qualify as speech "harmful to minors." With respect to social media, the Supreme Court has recognized that social media enables individuals to "engage in a wide array of protected First Amendment activity."

A law may burden adult speech even if it specifically targets material accessed by minors. The Supreme Court's decision in Reno struck down provisions of the Communications Decency Act (CDA) primarily on the basis that they would impermissibly burden adult speech. The reasons for believing the CDA would burden adult speech may apply to contemporary age verification laws. The Reno Court determined that the CDA's ban on transmitting indecent material to minors would burden adult speech "in the absence of a viable age verification process," because distributors of material would fear liability for transmitting material to minors. The Court also observed that a website operator's decision to adopt age verification may block adults from lawful content if those adults lack the credentials required for verification, such as a credit card.

Lower courts have suggested that age verification may further burden adult speech by deterring adult users who are not willing to provide identifying information to access potentially embarrassing content. In a different context, the Supreme Court held that a requirement that cable television operators block sexual programming unless a viewer requests access to the programming in writing would "restrict viewing by [cable] subscribers who fear for their reputations" should their request be made public.

Speech Rights of Minors

Minors, like adults, possess free speech rights under the First Amendment. The Supreme Court has repeatedly held that, except in "relatively narrow and well-defined circumstances," government has no more power to restrict speech for minors than it does for adults. Laws that target social media websites may fall outside these "narrow" circumstances. The Supreme Court has struck down other laws that attempted to restrict the dissemination of protected speech to minors, including laws involving violent video games and movies with nudity. Social media allows minors to access a broad array of protected speech, meaning a law restricting minors' access to social media may have a greater impact on minors' speech rights than narrower laws the Supreme Court has previously struck down.

Pornography age verification laws may also affect minors' access to constitutionally protected material. State laws that seek to mandate age verification for pornography often apply to "material harmful to minors," a term that tracks the language used by the Supreme Court in Ginsberg v. New York and Miller v. California. Although the Supreme Court has upheld restrictions on physical distribution of material harmful to minors, federal appellate courts have identified constitutional problems with such restrictions as applied to the internet. For example, the Third Circuit held in multiple decisions that the Child Online Protection Act's (COPA's) definition of "material that is harmful to minors" was unconstitutionally vague and overbroad.

The Miller definition of obscenity depends on whether "the average person, applying contemporary community standards" would find that the material in question "appeals to the prurient interest." Definitions of material "harmful to minors" incorporate this language and specify that the question is whether someone "applying contemporary community standards" would find the material "appeals to the prurient interest" of minors. In cases involving physical distribution of offensive material, the relevant "community standards" are those of the community where the material is received. The Third Circuit observed that applying a "contemporary community standards" requirement to internet communications, which are typically available worldwide, would subject all material on the internet to the standards of "the most puritanical communities." The Supreme Court has not decided how to apply "contemporary community standards" to internet communications, beyond concluding that the use of "contemporary community standards" alone did not render COPA unconstitutional. Some Justices have expressed support for a nationwide "community standard," while other Justices have suggested that the standards should depend on where material is received, as is the case with laws that do not involve the internet.

Efficacy and Alternatives

Whether a law is "narrowly tailored" depends on how effectively the law accomplishes its legislative purpose and the availability of less restrictive alternatives. The Supreme Court has looked to both factors. Addressing the effectiveness of age verification in 1997, the Reno court observed that the government had not proffered evidence that age verification would "actually preclude minors from posing as adults." In 2004, in Ashcroft, the Court observed that promoting the use of blocking and filtering software would be "less restrictive" and "likely more effective as a means of restricting children's access to materials harmful to them" than criminalizing distribution of harmful material.

The internet looks very different in 2023 than it did in 1997, or even 2004. How changes in technology might affect a court's narrow tailoring analysis—potentially in ways that distinguish age verification laws from laws previously declared unconstitutional—may be the biggest open question in assessing the constitutionality of new laws. If age verification technology has grown more effective, courts may be more willing to accept that requiring age verification can further a government interest in protecting minors. Likewise, if age verification solutions have become cheaper and more widely available, adopting them may impose less of a burden on website operators. With respect to alternatives, federal and state lawmakers introducing age verification legislation have stressed the shortcomings of those alternatives. For example, a bill from the 117th Congress that would have required age verification for certain websites discussed at length the ineffectiveness of blocking and filtering software in preventing minors' access to pornography.

Age verification laws may be less restrictive than the CDA or COPA. The Reno Court listed potential means of narrowing the CDA, including "making exceptions for messages with artistic or educational value, providing some tolerance for parental choice, and regulating some portions of the Internet—such as commercial Web sites—differently from others." State age verification laws have incorporated these means to varying degrees. For example, Louisiana's pornography age verification law applies only to commercial entities and covers only material that "lacks serious literary, artistic, political, or scientific value for minors," and Utah's social media age verification law allows minors to use social media with a parent's consent.

Considerations for Congress

State age verification laws are already facing legal challenges in court, and any federal effort to impose age verification requirements on website operators would likely face First Amendment challenges as well. A law's ability to withstand challenges will depend principally on whether the law is content based or content neutral and how narrowly tailored the law is. Even if a law is content neutral, it may pose significant First Amendment issues. For example, a law that targets social media may be less clearly content based than a law that targets pornography, but minors likely have a greater interest in accessing social media than in accessing pornography. Questions of narrow tailoring will likely concern how much protected speech an age verification law affects and how effective age verification is at achieving its legislative purpose in comparison to alternatives. The changing technological landscape might distinguish modern age verification requirements from similar laws introduced more than 20 years ago.

In addition to technological alternatives, courts may consider legislative alternatives, including some that Congress has already enacted. The Children's Internet Protection Act conditions federal funding for schools and libraries on the installation of blocking and filtering software, and the Supreme Court held that this law does not violate the First Amendment. Another law, the Children's Online Privacy Protection Act, prohibits website operators from collecting children's personal information without verified parental consent. There are no reported cases considering whether this law complies with the First Amendment. Because these alternative approaches might raise fewer constitutional concerns, Congress may consider whether they could sufficiently address the harms that legislatures have sought to remedy through age verification requirements.