As kids head back to school and lawmakers return to Washington in September, children’s online safety will be back on tech policy’s front burner. Politicians and advocates are reviving their push for new laws to protect kids online amid a string of child sexual exploitation lawsuits against Roblox and reports that Meta allowed its AI chatbots to hold “sensual” conversations with young people. But the courts could constrain policymakers’ options.

Louisiana’s attorney general sued Roblox over reports that child predators used the platform to sexually exploit minors. The suit, filed Thursday, joins several others alleging Roblox didn’t do enough to stop kids from being sexually exploited or groomed on the popular gaming platform.

With more than 100 million active users, many of them under 18, Roblox is among the most popular destinations for kids online. It lets people build and share their own games and socialize while playing them. For years the company has faced criticism that its platform enables adult predators to connect privately with minors. A 2024 Bloomberg investigation into the site’s alleged “pedophile problem” helped spark a flurry of lawsuits against the company.

Matthew Dolman, managing partner of the Dolman Law Group, told Tech Brief he has filed six lawsuits against Roblox in the past 45 days and has many more in the works. “It’s been like an avalanche of communications we’re getting from concerned parents around the country,” he said. Dolman accused the company of prioritizing user growth and profits over children’s well-being.

Louisiana’s lawsuit alleges that Roblox “knowingly enabled and facilitated” the abuse of children by failing to effectively implement safety measures such as verifying users’ ages.

Roblox said it takes the problem seriously and is doing everything in its power to address it. “We share the critically important goal of keeping kids safe online and any assertion otherwise is categorically untrue,” Roblox spokeswoman Kadia Koroma said in an emailed statement. She said many of the user-created Roblox games highlighted in the Louisiana lawsuit were removed long ago for violating the company’s safety policies.

Roblox has been an industry leader in developing ways to keep kids safe online, Koroma added. Last month, the company announced new safeguards on its private messaging features, including a plan to implement “age estimation” technology.

Meanwhile, Sen. Josh Hawley (R-Missouri) is opening an investigation into Meta’s AI chatbot policies. Reuters reported on Thursday that an internal Meta policy document showed the company explicitly allowed its AI characters to “engage a child in conversations that are romantic or sensual.” The document, approved by Meta’s legal, public policy and engineering staff, including its chief ethicist, also allowed the chatbots to “generate false medical information and help users argue that Black people are ‘dumber than white people,’” Reuters said.

Meta acknowledged the document’s authenticity and said it has since changed the guidelines around conversations with minors, calling them a mistake. “We have clear policies on what kind of responses AI characters can offer, and those policies prohibit content that sexualizes children and sexualized role play between adults and minors,” spokesman Andy Stone told Tech Brief. “Separate from the policies, there are hundreds of examples, notes, and annotations that reflect teams grappling with different hypothetical scenarios. The examples and notes in question were and are erroneous and inconsistent with our policies and have been removed.”

Hawley, a longtime antagonist of Meta, called the guidelines “sick” and said Friday that he will be “launching a full investigation” into how they came about.

Child safety advocates argue the Roblox and Meta revelations underscore the need for new regulations. “Companies like Meta and Roblox have made it crystal clear they are willing to allow — and even encourage — perversion, predation, and abuse on their platforms to line their pockets,” Sen. Marsha Blackburn (R-Tennessee) said in a statement to Tech Brief on Monday. “How many innocent children must fall prey to Big Tech’s exploitation before Congress finally sees fit to pass the Kids Online Safety Act?”

Blackburn in May joined Sen. Richard Blumenthal (D-Connecticut) in reintroducing the Kids Online Safety Act, or KOSA. The bill would impose a host of new responsibilities on platforms that allow young users. It passed the Senate last year but stalled in the House.

Groups backing the law include the nonprofit National Center on Sexual Exploitation, previously called Morality in Media. “Roblox would be forced to find solutions to protect the children who use the platform if KOSA was passed,” said Haley McNamara, its executive director.

KOSA isn’t the only online safety bill in the works, and it might not be Republican leaders’ top priority this fall. Despite its bipartisan backing, KOSA has drawn criticism from both right and left, and some lawmakers worry it could face First Amendment hurdles after a court struck down a California law with some similar provisions. In June, Senate Commerce Committee Chairman Ted Cruz (R-Texas) opted to advance a different kids’ safety bill, focused on data privacy, ahead of KOSA.

There is a sense on the Hill that Republican leaders may be leaning toward a more piecemeal approach to kids’ safety legislation this term, with narrower bills targeting age verification and minors’ access to online pornography. Blackburn, who has led the charge on KOSA among Republicans, announced this month that she will run for governor of Tennessee, putting her Senate future in doubt.

Laws requiring age checks by online platforms or app stores face their own legal hurdles. The Supreme Court in June paved the way for some forms of mandatory online age verification when it affirmed a Texas law requiring age checks for adult websites. Last week, the high court allowed Mississippi to begin enforcing a law requiring big social networks to verify users’ ages.

Yet the court’s ruling was not the win for would-be regulators that it might seem. Writing in agreement with the ruling, Justice Brett M. Kavanaugh said the Mississippi law is “likely unconstitutional,” but the plaintiffs hadn’t shown enough evidence to justify an emergency order blocking it.

Jennifer Huddleston, senior fellow in technology policy at the libertarian-leaning Cato Institute, said regulations intended to protect kids online tend to backfire. “Age verification and other one-size-fits-all approaches from policymakers will often result in serious speech and privacy concerns without fully understanding or addressing the problems,” Huddleston told Tech Brief via email. “Each child and each family has unique concerns and trade-offs when it comes to technology usage, and parents, not policymakers, are the best ones to determine what an appropriate solution can be.”