South Carolina Attorney General Alan Wilson announced today that 42 attorneys general have sued Meta in federal and state courts, alleging that the company knowingly designed and deployed harmful features on Instagram and its other social media platforms that purposefully addict children and teens, all while falsely assuring the public that those features were safe and suitable for young users.
“Protecting our children is one of our most important jobs and that’s exactly what we’re trying to do with these lawsuits,” Attorney General Wilson said. “We can’t stand by and do nothing while Big Tech continues to engage in behavior that knowingly harms our children and breaks the law.”
Attorney General Wilson asserts that Meta’s business practices violate state consumer protection laws and the federal Children’s Online Privacy Protection Act (COPPA). These practices have harmed, and continue to harm, the physical and mental health of children and teens, fueling what the U.S. Surgeon General has deemed a “youth mental health crisis,” one that has ended lives, devastated families, and damaged the potential of a generation of young people.
The federal complaint, joined by 33 states and filed in U.S. District Court for the Northern District of California, alleges Meta knew of the harmful impact of its platforms, including Facebook and Instagram, on young people. Instead of taking steps to mitigate those harms, Meta misled the public about them, concealing the extent of the mental and physical health harms suffered by young users addicted to its platforms. The complaint further alleges that Meta knew young users, including those under 13, were active on the platforms and knowingly collected data from those users without parental consent.
While much of the complaint relies on confidential material that is not yet available to the public, publicly available sources including those previously released by former Meta employees detail that Meta profited by purposely making its platforms addictive to children and teens. Its platform algorithms push users into descending “rabbit holes” in an effort to maximize engagement. Features like infinite scroll and near-constant alerts were created with the express goal of hooking young users. These manipulative tactics continually lure children and teens back onto the platform. As Aza Raskin, the original developer of the infinite scroll concept, noted to the BBC about the feature’s addictive qualities: “If you don’t give your brain time to catch up with your impulses, . . . you just keep scrolling.”
Meta knew these addictive features harmed young people’s physical and mental health, including by undermining their ability to get adequate sleep, yet it neither disclosed those harms nor made meaningful changes to minimize them. Instead, it claimed its platforms were safe for young users.
In parallel complaints filed in state courts today, eight attorneys general have made similar allegations.
The multistate coalition that brought today’s complaint is also investigating TikTok’s conduct over a similar set of concerns. That investigation remains ongoing, and the states have gone to court to compel TikTok to disclose information and documents after it failed to provide adequate discovery.
States joining the federal lawsuit are Arizona, California, Colorado, Connecticut, Delaware, Georgia, Hawaii, Idaho, Illinois, Indiana, Kansas, Kentucky, Louisiana, Maine, Maryland, Michigan, Minnesota, Missouri, Nebraska, New Jersey, New York, North Carolina, North Dakota, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Virginia, Washington, West Virginia, and Wisconsin. Florida is filing its own federal lawsuit in the U.S. District Court for the Middle District of Florida.
Filing lawsuits in their own state courts are the District of Columbia, Massachusetts, Mississippi, New Hampshire, Oklahoma, Tennessee, Utah, and Vermont.