On December 19, 2022, the FTC announced two settlements with Epic Games, maker of the popular video game "Fortnite," totaling $520 million in civil penalties and other monetary relief relating to alleged violations of the Children's Online Privacy Protection Act (COPPA) and the FTC Act. As part of a proposed federal court order filed by the Department of Justice on behalf of the FTC, Epic agreed to pay a record $275 million civil penalty for allegedly violating the FTC's COPPA Rule, the largest penalty the agency has ever obtained for a rule violation. Under a separate administrative order, Epic agreed to pay $245 million to refund consumers for its alleged use of "dark patterns" that caused players to make unintentional purchases. The settlements were approved by a unanimous, bipartisan 4-0 vote, with Commissioner Christine Wilson filing a separate concurring statement.

The settlements are notable for more than just the sizable monetary penalties. For the first time, the FTC's privacy order imposes heightened privacy obligations for teens aged 13 to 17 (who are not covered by COPPA, which applies only to children under 13). It also includes the first-ever charges relating to unfair default privacy settings, focusing on the public-by-default nature of audio communications within the Fortnite video game environment that led to "matchmaking children and teens with strangers while broadcasting players' account names and imposing live on-by-default voice and text communications." All businesses that offer child-directed online services, or that know kids and teens are using their platforms, should revisit their privacy compliance in light of several key takeaways from these landmark settlements.

Privacy Violations

According to the FTC's complaint, Fortnite is a "child-directed" service that must comply with COPPA, which would require Epic to obtain verifiable parental consent before collecting or using personal information from children under the age of 13. The FTC based this conclusion on the game's visual style, gameplay, and features, and further alleged that Epic knew (via news reports and internal studies) that children made up a substantial portion of Fortnite's 400 million players. Nonetheless, the FTC alleges that, for at least the first two years of Fortnite's operation, Epic failed to implement parental controls or "minimal privacy settings" and took no steps to obtain parental consent before collecting children's personal information. Moreover, Epic allegedly required parents to "jump through extraordinary hoops" to delete any personal information that was collected.

The FTC also alleged that Epic violated the FTC Act's prohibition against unfair practices by failing to protect the personal information of both kids and teens by default. This landmark unfairness count is important because it marks both the first time the FTC has explicitly required heightened privacy protections for teens and the first time the agency has alleged that public-by-default privacy settings can be unfair. The complaint states that Fortnite's default settings enabled strangers to communicate with children and teens via real-time voice and text chat, exposing them to verbal and sexual harassment, threats, and other harms. Despite warnings from its own employees, the FTC alleges, Epic failed to adopt adequate safeguards against this harm, such as default settings that required users to opt in to voice and text chat. According to Commissioner Wilson, "Epic Games knew that its products and/or services presented substantial risk of harm and did not take simple steps to address that risk."

In addition to civil penalties, the order requires Epic to adopt default settings that require opt-in consent before enabling voice and text communications for children and teens. It also requires Epic to delete all personal information collected from children in violation of COPPA, establish a privacy program that addresses the concerns outlined in the complaint, and submit to regular outside assessments, among other standard FTC terms.

Use of Dark Patterns

The FTC also filed an administrative complaint alleging that Epic used dark patterns and other deceptive practices to trick gamers into making inadvertent purchases, resulting in millions of dollars in unauthorized charges. The FTC alleged that Epic charged credit cards without requiring cardholder consent, allowed children to make in-game purchases without parental consent, and intentionally obscured its cancel and refund features. When parents or cardholders disputed these charges, Epic allegedly locked the gaming account in question, causing users to lose access to their purchases.

Key Takeaways

While these settlements reiterate many of the common themes from previous privacy and billing-related consent decrees, there are a few key takeaways for businesses to consider:

  • Online services that are directed to children and teens should assess the risks of sharing personal information by default. The FTC will closely evaluate how an online service's default settings can cause injury to kids and teens. While the facts of the complaint are somewhat specific to Fortnite's gaming platform as a vehicle for audio communications, the FTC is likely to expand this analysis in future cases by focusing on harms unique to kids and teens using online services (e.g., depression, compulsive use, cyberbullying, and other harms). The case also sheds light on how the FTC will evaluate whether a service is "directed to children," and businesses will want to understand the factors the agency applies in this analysis.
  • Privacy settings should be easy to understand and clearly communicated to kids, teens, and their parents. The FTC's complaint describes internal company documents recommending that Epic Games allow in-game audio chat only as an opt-in feature, but alleges that these recommendations were ignored for several years. When the company eventually implemented an opt-out toggle for voice chat, the FTC alleges that users were not informed of the setting's existence and that the company inconspicuously placed the control in the middle of a lengthy settings page. Companies should ensure that privacy settings are communicated clearly and effectively, and it is worth reviewing the FTC's order, which provides detailed guidance on how Epic Games must disclose its privacy practices and obtain affirmative express consent before allowing minors to share their personal information or converse with other users.
  • Everything is now a dark pattern from the FTC's perspective. "Dark patterns," sometimes known as deceptive designs, have increasingly been the focus of recent FTC actions. Many of these actions restyle typical deception claims as "dark patterns" cases, like the recent $100 million settlement with Vonage based on "junk fees" and a variety of practices that made it difficult for consumers to cancel recurring charges. Similarly, the Epic Games complaint resembles the unauthorized billing cases the FTC brought several years ago against various large online platforms, but the conduct is now described as a "dark pattern."

If you have questions about COPPA compliance or other federal, state, or international privacy issues, please contact Ben Rossen, Maureen Ohlhausen, or any other attorney from Baker Botts' privacy and data security practice.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.