Protecting Children Online: Evaluating Possible Reforms in the Law and the Application of COPPA

The tension between protecting private information online and allowing the online information exchange vital to digital commerce has resulted in continuing debates about privacy legislation in the United States. Some of these concerns involve children’s privacy, even though the Children’s Online Privacy Protection Act (COPPA), enacted by Congress in 1998, provides a better-defined framework for children’s privacy than exists for online privacy in general.

For the first 20 years after its enactment, COPPA was relatively effective in protecting children’s privacy without significantly hampering the growth of online services. Recently, however, COPPA has limited the ability of online service providers to use behaviorally targeted advertising, particularly in videos watched by children. The FTC’s 2019 consent agreement with YouTube effectively prohibits behavioral advertising to children, while also making it harder for content providers to earn revenue from some content targeted primarily to adults. In general, attempts to strengthen privacy protection for children and teens, whether by legislation or FTC regulation, often do more harm than good. Privacy regulation that protects children and teens should consider the benefits of online information sharing and behavioral advertising and pursue rule changes that prevent or discourage real harm in a cost-effective way.

This policy brief begins by considering the rationale for enacting COPPA. It then describes the early enforcement of COPPA and its definition of personal information. The FTC’s more expansive definition of personal information in 2013, combined with the YouTube consent decree in 2019, has been used to restrict online advertising more than is necessary to protect children from privacy-related harms. Furthermore, legislation has recently been proposed or enacted to expand the application of COPPA to teens. This paper explores and evaluates possible directions for future policy in light of tradeoffs among the quality of online services, free speech rights, and children’s privacy.

The Purpose of COPPA

COPPA “was designed to protect children from online threats by promoting parental involvement in a way that also preserved a rich and vibrant marketplace for children’s content online.” There is some debate as to whether it was intended to limit online advertising to children, though several who played a role in getting it enacted would say that it was. Others, citing comments made during the hearings that led to the passage of the bill, argue that the focus of Congress was on child safety and that the concern expressed about online marketing practices “was driven, at the root, by concerns over child safety.”

COPPA applies to children under 13 and requires that websites “obtain verifiable parental consent for the collection, use, or disclosure of personal information from children.” According to the statute, personal information includes first and last names, home address, email address “or other online contact information,” phone number, Social Security number, and “any other identifier that . . . permits the physical or online contacting of a specific individual.” It also includes “other information that the website collects from a child and combines with one of these identifiers.”

One question the FTC had to answer when deciding how to enforce COPPA was how to interpret the statutory definition of “collection.” The definition covers explicit requests for information, and it applies to sites that “enable children to make personal information publicly available . . . except where the operator deletes all individually identifiable information from postings by children before they are made public, and also deletes such information from the operator’s records.” For these kinds of sites, the FTC intended “to make parents gatekeepers over which sites their children join or participate in.” When defining what it means to “enable children” to disclose personal information, the FTC applied the statute broadly, requiring parental consent even for children’s one-on-one communications, such as sending an electronic greeting card to a friend.

The FTC was charged with enforcing COPPA, and early on it needed to outline how websites were to determine whether users were children under 13. It started by identifying websites directed to children and distinguishing them from general audience websites. The FTC did not permit collection of personal information from any users of child-directed websites without verifiable parental consent. For general audience websites, the FTC specified that an operator must not collect data from users if it has “actual knowledge” that they are under 13.

Arguably, COPPA has been effective in protecting the online safety and privacy of children under 13 because of the unique circumstances of that age bracket and the kinds of sites that have evolved to serve them. Many child-oriented sites have limited functionality, and some smaller sites are fee-based. But COPPA has been less effective in protecting children who use sites that are not child-directed because of the difficulty of obtaining age verification. Site operators cannot count on children to honestly reveal their age if they are younger than 13, particularly if they think they must be 13 or older to use the site.

Changes to FTC Enforcement of COPPA

In 2001, before the FTC began its formal enforcement of COPPA, the staff searched websites directed to children, identified those that collected personal information, then determined whether they posted privacy policies and obtained parental consent. The FTC issued warning emails to those sites that appeared to have substantial compliance problems. In the early years, most COPPA enforcement focused on firms that collected names, addresses, phone numbers, or email addresses from children on their websites.

In 2010, the FTC held roundtables to review COPPA. By that time, a substantial percentage of children had their own cell phones, many of which could be used to access the internet. This made it easier for children to access a wide variety of websites without parental supervision. Roundtable participants raised a number of concerns, one of which was the problem of determining the age of website users. Websites that tried to grant access by age ran into the problem of children under 13 lying about their age. There is no foolproof way to verify the age of internet users, which limits website operators’ ability to determine whether users are under 13. Major platforms such as Facebook, Messenger, and TikTok do not permit users under 13, but they also do not verify the age of their users. Apart from asking users their age, the only additional step these platforms take is asking users to specify their birth year. The result is that each platform has many underage users.

In 2013, the FTC adopted final amendments to the COPPA Rule. The amendments expanded the definition of personal information to include, among other things, a persistent identifier “that can be used to recognize a user over time and across different websites or online services,” such as a customer number held in a cookie, an Internet Protocol (IP) address, a processor or device serial number, or a unique device identifier. Classifying persistent identifiers as personal information was controversial. TechFreedom was critical of this and other COPPA changes, arguing for more permissive policies consistent with COPPA’s original goals, which include enhancing parental involvement. In addition to expanding the list of personal information that cannot be collected without parental consent, the amendments also added new methods that service providers can use to obtain parental consent, “including through electronic scans of consent forms, video conferencing, use of government issued identification” and debit cards.

In 2019, FTC policy changed significantly following a settlement with YouTube for violating COPPA. In the 2013 Final Rule Order, the FTC had identified YouTube and Facebook as general audience websites. In the 2019 settlement, the FTC stated it would no longer consider YouTube as a whole to be a general audience website. It became the responsibility of content creators to classify the videos they post on YouTube, with the possibility of incurring a fine of up to $42,530 per violation if a video is directed to children and not so classified. This change increased the risks faced by content creators whose videos are largely watched by a general audience that may include some children.

In the settlement with YouTube, the FTC effectively prohibited behaviorally targeted advertising on child-directed websites. Contextual advertising displays ads related to the content of a website, whereas behaviorally targeted advertising displays personalized ads to each user based on data collected about them and their interests. Over 900,000 people have signed a petition asking the FTC to reconsider its settlement. The petition argues against shutting off personalized ads attached to children’s content, noting that without targeted advertising “quality family-friendly content will shrink” because contextual advertising earns content creators only 10 to 40 percent of what behaviorally targeted advertising does.

Parents who do not want their children exposed to behavioral advertising have several options. For YouTube users, these include disabling behavioral advertising while viewing YouTube; subscribing to YouTube Premium, a paid alternative that has no advertising; or using ad blockers. Other websites have similar options. The FTC’s decision to treat persistent identifiers as personal information effectively prohibits firms from using data linked to these identifiers for targeted advertising to children whose parents did not consent to data collection. The FTC could reconsider its decision in the future, but even if it does not, nothing in COPPA should prevent behaviorally targeted advertising to children whose parents give consent to data collection.

In its recent rulemaking, the FTC has indicated an openness to limiting or banning behaviorally targeted advertising, not just to children but to everyone. The FTC’s regulation of online privacy in general is based on its authority to police unfair and deceptive practices. But the FTC has never brought a case alleging that targeted advertising is unfair or deceptive.

Proposal for Expanding COPPA to Teens

Several bills considered by the 117th Congress would affect children’s privacy. The most prominent among them is the Kids Online Safety Act (KOSA), which had some bipartisan support. This bill is intended to make online environments safer for minors, defined as “an individual who is age 16 or younger.” Its most important impacts would be on adolescents ages 13–16, to whom COPPA does not apply. KOSA imposes a duty of care on any covered platform, including social media services, video games, video streaming services, messaging applications, and educational services, that is used or is likely to be used by minors. This means the platform “shall take reasonable measures in its design and operation of products and services to prevent and mitigate mental health disorders . . .; patterns of use that indicate or encourage addiction like behaviors; physical violence, online bullying and harassment of a minor; sexual exploitation . . .; promotion and marketing of narcotic drugs, tobacco products, gambling, or alcohol . . .” But the problem with imposing such a duty of care is that “whether or not you met your obligations is determined after something bad happened.”

KOSA emphasizes protecting adolescents, who are presumed to be particularly vulnerable in some ways. According to TechFreedom, several provisions of KOSA effectively mandate age verification. In addition to the duty of care, KOSA requires platforms to provide a “safeguard” that would “limit the ability of other individuals . . . in particular individuals aged 17 or over” to find or contact a minor. It is hard to imagine how platforms could comply with this safeguard unless they can determine whether each particular user is a minor. The bill, recognizing that platforms must substantiate the age of their users to protect themselves from liability, requires “a study of the technology to verify user age at both the device and operating system level.”

Expanding COPPA to cover adolescents 13 and older is much more problematic than applying it to children under 13. This is because websites that children under 13 spend most of their time on are of limited interest to adults, whereas many websites that most interest adolescents also attract a large number of adult users. For this reason, the only way to verify the age of users 16 and under would be to verify the age of all users.

KOSA requires platforms to police content provided by third parties to adolescents that may be perfectly legal to offer to adults. This would require age verification for many websites that are heavily used by adults, which in turn would restrict anonymous speech and likely violate constitutional rights. The age verification necessary to implement KOSA is similar to what would have been required under the 1996 Communications Decency Act and the Child Online Protection Act (COPA), enacted in 1998, both of which were struck down in court. Furthermore, age verification poses privacy and security risks because it may require collecting sensitive information related to a person’s identity. Such information could make adolescents less safe if it gets into the wrong hands. And the costs of age verification may be substantial and would be particularly burdensome for smaller web platforms.

To “prevent and mitigate” all the harms listed by KOSA, platforms would have an incentive to employ broad content filtering, which likely would filter out some beneficial information. To limit their risk, some platforms may even ban minors from their services entirely. KOSA could be used by state attorneys general to purge the internet of speech that presents controversial views. KOSA could result in websites getting sued for anything bad that “happens that is even remotely connected to a website.” And KOSA is likely to reduce adolescents’ access to useful information and interaction that could help prevent some of the bad outcomes supposedly prevented or reduced by its provisions.

KOSA requires that platforms provide parents with tools to supervise their minor children’s use. It also requires platforms to provide notice to minors when the parental tools are in use. In addition, platforms are required to provide notices to parents of minors about policies and practices of the platform, information about how to access safeguards and parental tools, and warnings about when use of the platform poses “any heightened risks of harms to minors.”

Whether or not Congress enacts new privacy legislation like KOSA, recent FTC rulemaking may have some impact on how COPPA is enforced. The FTC noted that it is considering limiting targeted advertising to children or teenagers irrespective of parental consent. More generally, it raised several questions for public comment:

  • “Is parental consent an efficacious way of ensuring child online privacy?”
  • Should platforms that do not target children and teenagers be required to take steps to determine the age of their users and provide additional protections for minors?
  • Should new trade regulation rules set out clear limits on transferring, sharing, or monetizing children and teens’ personal information?

In addition to possible federal legislation or stricter regulation by the FTC, state legislation also affects privacy policy toward children and teens. The most important is the California Age-Appropriate Design Code Act, which was signed by the governor in September 2022 and takes effect in 2024. The law prohibits, by default, profiling a child (defined as “anyone under 18”). It defines profiling as “automated processing of personal information” for purposes such as “analyzing or predicting a user’s health, economic situation, interests or behavior.” This law will make it difficult to engage in targeted advertising to minors.

Some platforms have changed their practices as federal and state governments propose or enact increasing restrictions on collecting and using information gathered from children and teens. Google, for example, has recently changed how it advertises to teens and children, stating that “on Google Accounts of people . . . under 18, Google disables ad personalization.” In 2021, Facebook announced that it would no longer allow advertisers to use its platform to show those under 18 ads based on their interests or online activity.

Media and information accessible online have important developmental impacts on adolescents, both positive and negative. The challenge is to find the best way to limit their exposure to potentially harmful content and communication without overly restricting their access to beneficial content. Parents have traditionally been viewed as having a central role in governing exposure to information that influences the development of minors. But the law also recognizes that adolescents have certain rights to autonomy from their parents. And parents generally accept the idea that, during their teenage years, children should be granted the freedom to make their own decisions. The uneasy balance between parental authority and adolescent autonomy has influenced policy regarding online services. For example, before COPPA was enacted, Congress removed a provision that would have required websites to notify parents and give them an opportunity to prevent or curtail the collection of their adolescent children’s personal information. The provision was removed because of concerns about free speech rights.

Moving forward, any change in the age coverage of COPPA should be subject to congressional action. How the law should protect adolescents from potentially harmful online content and how it should empower parents are difficult questions. But decisions on these questions would better reflect the diverse views of the population if they were openly debated by Congress.

Conclusion

Privacy policy aimed at protecting children and adolescents should take account of the tradeoffs between the benefits of online firms collecting and processing personal information and the costs of those firms disclosing such information. But the costs of regulation are also important. To the extent that it is less costly to regulate data collection affecting children under 13, especially in a regulatory regime that emphasizes parental consent, a strong case can be made for regulating online data collection affecting that age group. However, the same cannot be said for adolescents. Although some restrictions may be appropriate for adolescents, the costs of age verification and enforcement make it much more difficult to treat this age group differently from adults.

Legislation and FTC policy toward privacy, especially children’s privacy, have become more restrictive in recent years. It is important to enforce restrictions on what is done with data collected from children and enhance parents’ authority to protect their children from potentially harmful online interaction. Nevertheless, the online ecosystem that enables content providers to rely on revenue earned from online behavioral advertising, based on collecting and processing data, has generated many mutually beneficial exchanges and is worth preserving. With sufficient parental involvement and legal safeguards, the fundamental elements of this ecosystem can be preserved so as to enhance the quality of content that is available and affordable to children on the internet.

