Privacy In Focus®

Legislators in Minnesota are considering a bill that would prohibit social media platforms from using algorithms to target content to users under 18. The proposal is the latest in a series of federal and state legislative efforts addressing children's and teenagers' use of online technology. Indeed, President Biden raised the profile of similar efforts in his March 1 State of the Union address, calling for stronger privacy protections for children and a ban on targeted advertising to children.

Below, we provide summaries of the latest Minnesota bill, as well as an earlier U.S. Senate proposal, both of which focus on the use of algorithms in relation to children. Additionally, we explore a recent Congressional Research Service report concluding that content-based restrictions on children's use of the internet can present significant First Amendment concerns.

Companies active in digital media should pay close attention to these developments, as this type of legislation could significantly impact both existing and planned products and services, even if the primary audience extends beyond children.

Minnesota Is Considering a Law To Prohibit Certain Social Media Algorithms That Target Children and Teens

The Minnesota proposal – HF 3724 and its companion SF 3933 – would prohibit any social media platform with more than 1 million users from using an algorithm to target user-generated content at any user under the age of 18. The bill defines social media platforms broadly as any "electronic medium, including a browser-based or application-based interactive computer service, telephone network, or data network, that allows users to create, share, and view user-created content," but excludes "Internet search providers or email." The House bill defines algorithm as "software used by social media platforms to (1) prioritize content, and (2) direct the prioritized content to the account holder." The Senate bill offers a different definition: "a technical means of sorting posts based on relevancy instead of publishing time, in order to prioritize which content a user sees first according to the likelihood that they will actually engage with such content."
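To make the distinction in the Senate definition concrete, the following minimal Python sketch contrasts a purely chronological feed with one sorted by likelihood of engagement. The posts and the predicted_engagement score are invented for illustration (a stand-in for whatever "relevancy" signal a platform might use); only the engagement-sorted ordering would appear to fall within the Senate bill's definition.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    published_at: datetime
    predicted_engagement: float  # hypothetical model score: likelihood the user engages

posts = [
    Post("alpha", datetime(2022, 4, 1, 9, 0), 0.12),
    Post("beta", datetime(2022, 4, 1, 8, 0), 0.87),
    Post("gamma", datetime(2022, 4, 1, 10, 0), 0.40),
]

# Chronological feed: newest first, ordered only by publishing time.
chronological = sorted(posts, key=lambda p: p.published_at, reverse=True)

# Relevancy-sorted feed: ordered by predicted engagement, as described in the Senate definition.
by_relevancy = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

print([p.author for p in chronological])  # ['gamma', 'alpha', 'beta']
print([p.author for p in by_relevancy])   # ['beta', 'gamma', 'alpha']

Under the House definition, by contrast, liability would turn on whether software both prioritizes content and directs it to the account holder, a formulation that does not hinge on relevancy versus recency.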

Both the House and the Senate bills in Minnesota create exceptions for algorithms that work to block "inappropriate or harmful content," and for software or devices with parental or internal controls "to filter content for age-appropriate material." Federal, state, and local governments, as well as public and private schools, colleges, and universities, would also be exempt. Notably, both bills would impose liability on a platform whenever a user it knew or had reason to know was under the age of 18 received user-created content via an algorithm. In addition to damages, the proposal establishes a statutory penalty of $1,000 per violation. As of the date of this article, both bills have been approved in their respective committees.

In 2021, Senators Markey and Blumenthal Introduced the Kids Internet Design and Safety Act

On the federal front, Sens. Ed Markey (D-MA) and Richard Blumenthal (D-CT) introduced the "Kids Internet Design and Safety Act" in September 2021. This bill, which is currently awaiting action by the Senate Commerce Committee, would govern online platforms, defined as "any public-facing website, online service, online application, or mobile application which is operated for commercial purposes."

The bill is focused on digital marketing methods and would prohibit several design features in online platforms "directed to" or used by children under the age of 16 (where the platform knows or has reason to know the user is of that age). For example, platforms could not use video auto-play; send alerts or messages designed to re-engage a user who is not using the service; display the quantity of engagement or feedback from other users; "unfairly" encourage a user to share personal information, submit content, or spend more time on the platform; or provide visual badges or reward symbols for using the platform more often. Companies would also be prohibited from encouraging youths to spend money on the platform and could not facilitate a financial transaction without parental notification.

Further, the bill would ban the use of algorithms to present non-educational content involving sexual material, the promotion of physical and emotional violence, unlawful activities, or "wholly commercial content that is not reasonably recognizable as such" to users under 16, and it would authorize the Federal Trade Commission (FTC) to determine whether such material is non-educational. The bill would also ban the use of algorithms to present several types of advertising to youths using the platform, including host-selling, "program-length advertisements," influencer marketing, and advertising of alcohol, nicotine, or tobacco. The FTC or state attorneys general would be authorized to conduct civil enforcement.

A Recent CRS Report Explores the Legal Complexities of These Approaches

Many aspects of these proposals raise significant First Amendment issues. The influential Congressional Research Service (CRS) noted in a recent report that Congress has long been interested in regulating the internet to prevent potential harms to children but has been limited by the First Amendment. Historically, many of these efforts have focused on sexually explicit content, and the report summarizes the reasoning behind federal courts' determinations that portions of the 1996 Communications Decency Act and the 1998 Child Online Protection Act violated the First Amendment. Based on these precedents, CRS points out that "federal power over indecent, non-obscene material has been limited to control exercised over broadcast communications." CRS also notes that content-based restrictions on children's access to internet content are likely to raise constitutional issues that should be considered at the drafting stage. As lawmakers continue to target algorithms, they will increasingly confront First Amendment precedent limiting their ability to restrict the display of certain content to minors.

Originally published April 2022

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.