Thursday, March 26, 2026

How Do We Protect Children from Becoming Addicted to Social Media?


My granddaughter began scrolling on a cellphone when she was around two years old, her index finger repeatedly swiping across its face as if she were already a teenager.

Some of the 18-year-olds in my classes at Berkeley seem to suffer withdrawal symptoms when I ask them to put away their phones.

I see young people in restaurants sitting with other young people, none saying a word to each other as they lose themselves in their devices.

Are they addicted? Yes, if you define addiction as getting such a dopamine rush that they feel compelled to use their cellphones for hours at a time.

How similar is this to a nicotine addiction? And — as was a central question thirty years ago when Big Tobacco was being sued — is Big Tech intentionally designing its product to hook young people?

The answer appears to be that the addictions are quite similar, and Big Tech is just as culpable as Big Tobacco.

On Wednesday, in California, a young woman prevailed in a lawsuit against social media giants Meta Platforms and Google’s YouTube, in which she accused them of designing their apps to be as addictive and harmful to adolescents as cigarettes. Jurors found the tech companies to be negligent in having failed to provide adequate warnings about the potential dangers of their products.

What seemed to persuade the jury were features Meta and YouTube had built into their software, such as infinite scroll, algorithmic recommendations, and autoplay videos, designed to get young users to engage compulsively with the platforms.

Internal company documents from Meta and YouTube executives showed they knew of and discussed the negative effects of their products on children.

In fact, this case and many others likely to follow in its wake (more than 3,000 other similar lawsuits are pending in California courts against Meta, YouTube, Snapchat and TikTok) are rooted in the litigation against Big Tobacco thirty years ago, in which plaintiffs argued that the tobacco corporations created addictive products that harmed their users.

I’m old enough to remember when U.S. Surgeon General Dr. Luther Terry issued the first landmark report warning that cigarette smoking causes cancer and other diseases, on January 11, 1964. I was a teenager then, quietly debating with myself whether to look cool by having a cigarette dangling from my lips.

The report, entitled Smoking and Health: Report of the Advisory Committee to the Surgeon General, sparked a national shift in public health — leading in 1965 to mandatory warnings on cigarette packages. The report and the warnings, and the hullabaloo surrounding them, put me off smoking.

Sixty years later, in 2024, U.S. Surgeon General Vivek H. Murthy called for adding warning labels to social media, explaining that the platforms were associated with mental health harms for adolescents. He wrote:

“The mental health crisis among young people is an emergency — and social media has emerged as an important contributor.

Adolescents who spend more than three hours a day on social media face double the risk of anxiety and depression symptoms, and the average daily use in this age group, as of the summer of 2023, was 4.8 hours. Additionally, nearly half of adolescents say social media makes them feel worse about their bodies.

It is time to require a surgeon general’s warning label on social media platforms, stating that social media is associated with significant mental health harms for adolescents.”

He’s right. A surgeon general’s warning label would remind parents and adolescents that social media may be unsafe.

Evidence from tobacco studies shows that warning labels can increase awareness and change behavior. When asked if a warning from the surgeon general would prompt them to limit or monitor their children’s social media use, 76 percent of people in one recent survey of Latino parents said yes.

But we shouldn’t stop there, and Big Tech shouldn’t be able to use warning labels as a defense to future lawsuits claiming social media addiction among young people.

Meta, YouTube, and other social media platforms must redesign their products to be less addictive to minors. Yet, as with Big Tobacco, they’re unlikely to do this unless liability judgments against them start mounting substantially.

In the 1998 Tobacco Master Settlement Agreement, the major tobacco companies agreed to pay over $200 billion to 46 states, 5 territories, and the District of Columbia to settle lawsuits over smoking-related health costs. The settlement imposed strict marketing restrictions and funded anti-smoking campaigns.

Now, cigarettes are prohibited in most workplaces and public spaces. The Food and Drug Administration restricts tobacco sales to individuals 21 or older, and additional rules under consideration would ban menthol flavors and reduce nicotine levels.

We need to protect our kids from social media no less strictly. Why not ban children under 16 from using social media, prohibit its use in schools, and have Big Tech pay for anti-social media campaigns directed at young people?

In December, Australia enacted a ban on social media use by children under 16. Malaysia, Spain, and Denmark are pursuing similar rules. American children deserve no less.

On Tuesday, in the first trial of its kind, a jury in New Mexico found Meta liable for failing to protect young people from online dangers, including sexually explicit content, solicitation, and human trafficking, and ordered the company to pay a $375 million penalty. Meta said it respectfully disagreed with the jury and would appeal the ruling.

By focusing on the design of Meta’s and YouTube’s apps rather than the content posted on them, the case brought by Kaley, the young woman who prevailed in California on Wednesday, sought to get around longstanding legal protections, known as Section 230 of the Communications Decency Act of 1996, that have largely shielded social media companies from being held liable for content on their platforms.

It is a bellwether case for thousands of similar lawsuits in California, serving as a test of how juries react to the evidence and arguments. The outcome doesn’t bind other cases but could encourage settlements now that both sides have a better sense of how their arguments might play out with jurors.

During the seven-week trial, Kaley testified that she started watching YouTube videos at age 6 and made an Instagram account at age 9. She uploaded more than 200 YouTube videos before she turned 10 and created 15 Instagram accounts before she turned 15, Kaley and her lawyers said.

Lanier, Kaley’s lead attorney, said that on one day she spent 16 hours on Instagram.

“I wanted to be on it all the time,” she said. “If I wasn’t on it, I felt like I was going to miss out on something.”

Meanwhile, Meta’s lawyers spent hours of cross-examination trying to convince jurors that Kaley’s struggles were caused by other factors, including a difficult family life and bullying at school. Meta lawyer Andrew Stanner said notes from six months of therapy appointments didn’t mention social media addiction or name any social media apps.

Jurors also heard from Meta executives Mark Zuckerberg and Adam Mosseri. Mosseri, the Meta executive tasked with running Instagram, testified that the app wasn’t “clinically” addictive, and Zuckerberg said his company’s goal was to give users something useful, not addict them.

“We used to give teams goals on time spent and we don’t do that anymore because I don’t think that’s the best way to do it,” Zuckerberg said when he took the witness stand in the second week of the trial.

On Tuesday evening, Meta introduced a new stock option program for senior executives to motivate them to grow the company at an extremely aggressive pace and reach a $9 trillion-plus valuation.

As the punitive-damages phase of the trial got under way in court on Wednesday, Lanier, the plaintiff’s attorney, displayed a jar of 415 M&Ms, each one representing $1 billion of the $415 billion in total stockholders’ equity of Alphabet Inc., he said. As he removed M&Ms one by one, Lanier noted how little the jar changed, illustrating how inconsequential even a $1 billion damages award would be to a company of Alphabet’s value.

Luis Li, an attorney representing YouTube, began his remarks by apologizing to Kaley.

“We at YouTube truly, truly hope that there have been things about YouTube that have enriched your life,” he said. He then began delineating how YouTube has instituted features to interrupt scrolling and give parents an indication of how much time their children are spending on their phones.

“They’re not perfect,” he said, adding, “you can’t walk into someone’s phone and turn on all of these features.”

Paul W. Schmidt, an attorney for Meta, referenced testimony from former company employees who, while critical of the company’s practices, said they did not believe it was intentionally trying to harm young people.

“There wasn’t an intention to do harm,” he said.
