Last year, Congress considered, but didn’t pass, the Kids Online Safety Act and failed to update the Children and Teens’ Online Privacy Protection Act. The Kids Online Safety Act would have required companies to undergo regular external audits of the risks their platforms create for minors, implement stronger privacy settings for minors and bear the burden of ensuring they mitigate foreseeable harms like posts boosting substance abuse, eating disorders or suicide.
There is plenty of evidence to indicate that time spent on social media is linked to increased depression, anxiety and self-esteem issues for kids and teens. In some cases, cyberbullying and harassment have even been linked to children’s deaths. Congressional inaction has exacerbated the dangers faced by children and teens who use social media and led to a predictable vacuum.
Now more than three dozen states are seeking to fill that vacuum by suing Meta, the parent company of Facebook, Instagram, WhatsApp and Messenger. They accuse Meta of violating a federal privacy law and of violating their state consumer protection laws. Separate but similar lawsuits make nearly identical claims against Meta based solely on state laws.
California and dozens of other blue and red states have alleged that Meta has lied about the safety of its social media sites and therefore violated state consumer protection laws. Specifically, the states allege that the social media sites are products that are designed to “induce young users’ compulsive and extended use.” Essentially, the lawsuit claims that Meta’s platforms are designed to addict minors, and others, to their products (much like slot machines) and that once they are on those platforms, the algorithms present minors with dangerous and harmful content.
The states claim that Meta knows full well that its platforms are causing children harm. Two years ago, a former Facebook employee leaked research showing that using Instagram can directly harm teenage girls. The research specifically involved content seen to detrimentally affect girls’ body images and self-esteem. CEO Mark Zuckerberg responded by saying that the research had been misconstrued and that his products weren’t designed to promote harmful or angry content.
The plaintiffs have also claimed that Meta collects the personal data of its minor users, in violation of the federal children’s online privacy law. Meta’s guidelines provide that it collects from minors’ accounts only data that is “needed for their device to work properly.”
Ideally, minors would stay off of social media entirely, or at least for the vast majority of their time. But we all know that isn’t going to happen. Social media sites are too tempting and ubiquitous. The next best option would be for companies to take real and concrete steps to implement safety and privacy protections for younger users. And while Meta has implemented some reforms, like increasing parental controls and age verification technology and removing certain sensitive content, there is more that can be done. Content moderation that quickly removes posts promoting bullying, harassment or suicide should be more robust.
“We want this activity to stop using its misleading algorithms,” Nebraska Attorney General Mike Hilgers said at a news conference Tuesday. “We want to make sure it complies with COPPA,” he said, referring to the Children’s Online Privacy Protection Act.
Not every problem requires a legislative fix. But when children are facing real harm and private companies don’t appear to be remedying that harm fast enough, lawmakers should step in. This is also a situation that cries out for federal regulation, because we need one uniform set of standards throughout the country. Simply by virtue of how the internet works, having a patchwork of laws that vary by state could present real administrative hurdles.
While neither the Kids Online Safety Act nor the Children and Teens’ Online Privacy Protection Act was perfect (critics contend that the proposals would “increase surveillance and restrict access to information”), lawmakers should reintroduce them and mandate that social media companies limit the time and frequency with which certain users can use their platforms and, perhaps most importantly, change the algorithms that determine which content is pushed to younger users.
Because Congress hasn’t acted, the majority of our states are trying to use lawsuits to force Meta to make changes.
Meta will almost certainly respond to these suits by arguing that another federal law, the Communications Decency Act, and specifically Section 230 of that act, protects social media companies from being sued for content users post on their platforms. It is well past time for Congress to update this federal law, which was passed in 1996, when the internet was in its infancy. The states have tried to circumvent the protections granted by Section 230 by suing under state consumer protection laws: rather than targeting the content of posts on Meta’s platforms, they take issue with how those platforms are designed and with how Meta has allegedly lied about their harmful effects.
We don’t know whether Meta’s defense will be successful, but we do know that in 1996, Congress didn’t and couldn’t predict the growth of social media companies and the accompanying dangers that have resulted. That’s why this section of the law must be updated to fit our current time.
Our children are facing a mental health crisis. Teen depression has doubled over the last decade. Is this solely the result of time spent on social media? Of course not. But the evidence makes it hard to escape the conclusion that social media is contributing to it. This is a situation that screams out for lawmakers to create a baseline of standards to protect the youngest members of our society from foreseeable harm while ensuring that children’s privacy is also protected. The details of such legislation may be complicated, but the reasons it’s needed are not. And there’s no need to wonder whether President Joe Biden would sign such legislation: he has called for it in his State of the Union addresses for the past two years.