Children spend a lot of time online. Who should be protecting them?
When does safety become censorship?
The latest onslaught of child internet safety bills has arrived, as expected, and it may soon intersect with America’s ongoing culture war.
As more evidence emerges that internet platforms can harm children and either can’t or won’t do anything to protect their users, the government has understandably felt the need to step in. States are proposing and even passing laws that restrict what children can access online, up to banning certain services entirely. On the federal level, several recently introduced bipartisan bills run the gamut from giving children more privacy protections to forbidding them from using social media at all.
Some of them also try to control the content that children can be exposed to. That comes with another set of concerns over censorship, especially now that some administrations have politicized ideas about what’s appropriate for kids to see. We’re already getting a glimpse of what various factions in this country think the internet should look like. We might be getting a much better look soon.
A new federal push to protect kids online — that states would help enforce
Protecting children from online evils, real or imagined, is a tale almost as old as the modern internet. Some of those fears, we’re increasingly learning, are not unfounded. Recent studies say that kids’ mental health is at crisis levels, and social media is often pointed to as a major contributor to that. Facebook whistleblower Frances Haugen’s 2021 revelations that the company hid research that said its services hurt teens’ mental health — claims that the social media giant says are inaccurate — are also cited as a major motivating factor for the legislative action we’re seeing now.
That action most recently took the form of the Kids Online Safety Act, or KOSA, which was reintroduced on Tuesday. Cosponsored by Sens. Marsha Blackburn (R-TN) and Richard Blumenthal (D-CT), this legislation would require platforms to implement several safeguards for users under 18. The bill is controversial for a few reasons, one of which is a so-called “duty of care” provision. This would mean that covered platforms have to prevent kids from being exposed to content that promotes or could contribute to mental health disorders, physical violence, bullying, harassment, sexual exploitation, abuse, and drugs, among other things.
On their face, these seem like good things for children to avoid. KOSA’s proponents aren’t wrong that harmful content finds its way to kids: because these companies’ business models rely on keeping users’ attention however they can, their recommendation systems can end up pushing harmful content at children. Free speech and civil rights advocates, however, are wary of any legislation that tries to control content, no matter how well-meaning. Several such groups, including Fight for the Future, the Electronic Frontier Foundation (EFF), and the American Civil Liberties Union, have come out against KOSA. At the same time, the bill already has the support of at least 30 senators from both sides of the aisle.
“We generally don’t like it when the government is trying to tell parents the correct way to parent their children,” said India McKinney, director of federal affairs at the EFF. “Yes, there is harmful stuff that happens online. That is absolutely true. But how do you define that in legislation, to make it clear what you mean and what you don’t mean, and in a way that platforms can [moderate]?”
The bill’s authors believe they’ve made it plenty clear in this latest version of KOSA, where definitions of harmful content are narrower and less open to interpretation than the previous congressional session’s version. For instance, “grooming” — which some on the right wing have adopted as their preferred term for pretty much any LGBTQ+ content — is no longer listed as an example of sexual exploitation. Along with about a third of the Senate, KOSA has the support of many children’s health and safety advocacy groups. Also, Lizzo.
Sen. Richard Blumenthal has made children’s online safety rules one of his major causes.
KOSA’s opponents aren’t just wary of its provisions about content. They also don’t like the power it gives to state attorneys general to enforce it. Some see this as an opening for state leaders fighting a culture war to go after online platforms that host speech about transgender rights, abortion care, or mention that gay couples exist. Or, really, any other content that’s become politically advantageous to censor and can be interpreted to fall under KOSA’s definitions, narrow as they are.
While it may have seemed like a stretch just a few years ago, this highly politicized version of kids’ online safety has become a reality to reckon with amid the latest moral panic that some Republicans have made the center of their campaign strategies. Some of these attorneys general and the states they represent have pushed laws that ban books or public school curricula that contain sexual, LGBTQ+, or race-related content. The laws are vaguely worded enough that libraries and schools are banning books preemptively, just in case someone finds something objectionable in them. Some states have passed, or are trying to pass, anti-trans laws that ban or restrict gender-affirming care for kids and even adults. They’ve even tried to ban drag shows.
Those states could conceivably do something similar to the digital world if given the chance. It’s not lost on some of KOSA’s opponents that Sen. Blackburn represents Tennessee, the state that tried to ban drag shows from being performed for or near children, or that she’s made several anti-gay and anti-trans comments and votes. We also know that platforms tend to over-moderate to ensure they can’t get in trouble, as we’ve seen some of them do to sex and sex-work-related content in the wake of FOSTA-SESTA. The end result is censorship, be it forced or voluntary.
A cautionary tale from state laws
Some recently enacted children’s online safety legislation shows us what state leaders want the internet to look like. These state laws pertain to children, but they impact adults, too.
A Utah law requires social media platforms to verify the ages of their users, which means people of all ages will likely have to submit some kind of verification to log into their social media accounts. The state passed another law requiring porn sites to verify visitors’ ages, which has prompted several porn sites to block Utah IP addresses entirely, saying it isn’t possible for them to verify ages the way the new law requires.
Louisiana also banned children from visiting porn sites and requires those sites to verify visitors’ ages by proving their identities. While Pornhub implemented an age verification system to comply with the law, it noted that Louisiana-based traffic decreased by 80 percent after it went into effect. And sure, it’s possible that that 80 percent was all children who could no longer access the site. It’s more likely that it was adults who could view that content legally but didn’t want to upload their IDs to be able to do so.
Meanwhile, Arkansas passed a law that requires users under 18 to get parental consent to use certain social media platforms (it’s so far unclear how ages will be verified). California has the Age-Appropriate Design Code, which requires online services to implement certain design features for younger users and limit the data that can be collected on them. Montana passed a law that would ban TikTok entirely, which isn’t exactly a child safety law but does very much affect children, with whom the platform is very popular. It has yet to be signed into law by the governor. The list of other states considering children’s online safety bills goes on and on.
Federal legislation for kids’ online safety is much less likely to be passed than the state versions, as Congress is more divided and moves more slowly than many state legislatures. But there are bipartisan bills that have some potential — and, critics say, problems.
Along with KOSA, there’s EARN IT, which passed out of committee on Thursday, setting it up for a vote in the Senate (the last two incarnations of EARN IT similarly passed out of committee, but never got a floor vote). Supporters say it will help law enforcement better fight child sexual abuse material. Opponents fear that EARN IT will be used to weaken or ban encryption for everyone. The Protecting Kids on Social Media Act, introduced last month, bans children under 13 from using social media and requires parental consent for children 13 and over. That would prevent children from seeing social media’s harms, but it would also keep them away from online resources that do some good.
And then there’s the sequel to the Children’s Online Privacy Protection Act, a 1998 law that gave children under 13 certain privacy rights and remains the only federal consumer online privacy law we have, even decades later. The Children and Teens’ Online Privacy Protection Act, or COPPA 2.0, was introduced on Wednesday by Sens. Bill Cassidy (R-LA) and Ed Markey (D-MA). Markey was also behind the original COPPA. As a privacy bill, COPPA 2.0 doesn’t have the same content moderation issues that other bills do, but Markey has had a hard time getting it passed in previous sessions. And it stops short of giving privacy protections to adults, something privacy advocates very much want.
Any law that covers people regardless of age, critics of these kinds of bills often point out, would take away the need to verify users’ ages — which can be a privacy violation in and of itself. Many of Louisiana’s porn-enjoying adults can probably attest to that. It could also solve or ameliorate some of the children’s safety issues without the need for problematic child-specific safety laws. But Congress so far hasn’t come close to passing that kind of privacy law after years of trying, so it seems unlikely that it will anytime soon.
Children’s online safety measures have been proposed and debated for decades, but they rarely went much further than that. Now, the threat that these ideas become law is very real, in part because the dangers online platforms present to kids are very real. But so is the possibility that kids’ online safety laws could be weaponized to censor content according to subjective and politicized views of what’s harmful. We’ve already seen what those views can do to school libraries. We may soon see what they’ll do to the internet.
A version of this story was first published in the Vox technology newsletter.