
If Your Child Is Addicted to TikTok, This May Be the Cure

Over the past few years, hundreds of families and school districts around the country have sued big tech companies on the grounds that the hypnotic properties of social media popular with children have left too many of them unwell. Citing the promotion of the “corpse bride” diet, for example, and other practices around dangerous forms of weight loss, Seattle Public Schools filed a complaint in January arguing that platforms like TikTok and Snapchat have been so unsparing in their delivery of harmful material to students that the resulting anxiety, depression and suicidal ideation were interfering with the school system’s primary mission of educating children.

More recently, in October, the New York attorney general, Letitia James, along with top prosecutors from more than 30 other states, filed suit against Meta alleging that the company put features in place on Instagram and Facebook to intentionally addict children for profit.

Tech companies, claiming First Amendment protections, have sought to get these sorts of suits quickly dismissed. But on Tuesday, a federal judge in California issued a ruling that makes that more difficult. In it, she argued that what concerned plaintiffs most — ineffective parental controls, the challenges of deleting accounts, poor age verification and the timing and clustering of notifications to ramp up habitual use — was not the equivalent of speech, so the suits under her review should be allowed to proceed.

Forty years ago, drunken driving was an epidemic that claimed the lives of young people, a problem that seemed unmanageable until a group of mothers committed themselves to pushing for laws that brought accountability. It was a pivotal moment in the modern history of public health, and, in the same way, 2023 is likely to be remembered as an inflection point in the health crisis surrounding social media.

In May came Surgeon General Vivek Murthy’s advisory, a “call for urgent action,” to develop policy around a practice that was eroding adolescents’ self-esteem and social lives and compromising their sleep and body image. Both state and federal legislatures have tried to enact laws that would keep certain kinds of emotionally disruptive content out of children’s view.

If nothing else, these efforts have emerged as a space of détente in our otherwise forever culture wars; TikTok seems to ignite adult rage no matter where you stand on gender-neutral bathrooms or banning “Antiracist Baby.” The Protecting Kids on Social Media Act, a Senate bill introduced in the spring that would require companies to verify the ages of their users, was sponsored by unlikely comrades: Chris Murphy, the Democrat from Connecticut, and Tom Cotton, the Republican from Arkansas.

The problem with some of the proposed legislation is its focus on prohibition, which leaves interpretations of harm to the discretion of judges and regulators and, in turn, opens the door to endless litigation. Montana provides the clearest case. In May the governor signed a law banning TikTok outright, with corporate fines promised if the app was found to be operating in the state. Immediately, both the platform’s parent company, ByteDance, based in China, and TikTok users themselves sued, maintaining that the law was unconstitutional.

New York has chosen to pursue a different path. State lawmakers, hoping to circumvent some of these obstacles and to serve as a model for the rest of the country, have bound themselves to an emphasis on distribution rather than content, on technical operation rather than matters of speech. Sponsored by Andrew Gounardes, a state senator from Brooklyn, two bills aim to implement several changes. First, they would require social media companies to restrict the use of predictive algorithmic features meant to keep users on a particular platform longer; second, they would allow parents to block access to social media sites between midnight and 6 a.m.

The legislation comes with the very vocal support of Gov. Kathy Hochul and Ms. James. “We want all these social media apps to show kids only the content they want to see,” Mr. Gounardes told me. “If a parent decides otherwise, they can turn the algorithm on. But the default is that it would have to be off.”

Zephyr Teachout, the legal scholar who helped draft the legislation, saw precedent in the way that gambling is regulated. The algorithmic targeting is similar to the kind deployed by slot machines, which over and over supply the tantalizing lineup of oranges and cherries that just keeps you pulling the lever, with the elusive jackpot in mind. Any form of online gambling, in fact, as Ms. Teachout pointed out, “involves the algorithmically determined type of content to be delivered, and in most states gambling is prohibited for those under 18.”

Were the law to come under Supreme Court review, a 2011 case, Brown v. Entertainment Merchants Association, in which the court struck down a California law banning the sale or rental of violent video games to minors, would probably emerge as a reference point. In that instance, even justices who agreed with the majority opinion pointed out that technology was changing at high speed and that different circumstances might require a more nuanced approach later on. “They put down a marker that is very relevant to this moment,” Ms. Teachout said. “They said that the court should not simply apply old standards to new and quickly evolving modes of digital media.”

The New York law has been constructed narrowly enough, in its creators’ view, that courts ought to recognize it as a critical response to a pervasive problem for which we all bear a special responsibility. Facebook would operate as it did in its early iteration, when what you received in your feed was only what you had signed up to see. No one would be prevented from looking up whatever they wanted. “It’s just that you couldn’t open up a Taylor Swift page and five clicks later be shown a video of how to harm yourself,” Mr. Gounardes said.
