Teen Crisis: How Facebook Targeted Tweens
Part 1, The Problem: An epidemic of depression, anxiety, and suicide among teenagers. Internal documents showing a company that knew and kept going. And a jury that held it accountable.

For this series we examine the youth mental health crisis — an epidemic of depression, anxiety, loneliness, and suicide among teenagers that public health officials now treat as an emergency. The data is stark, the timeline is clear, and the internal documents are now public. How it took this long — and whether we’re truly at a reckoning yet — is a story about power, money, and the law.
Today’s installment examines the problem — including a landmark jury verdict that may have changed everything. Next, the forces that got us here: the lobbyists, the regulatory vacuum, and the 26 words that made it possible. The third part explores solutions.
In November 2016, Mark Zuckerberg — chief executive of the company now known as Meta, the owner of Facebook and Instagram — made a decision. The company’s own data showed that younger users were exceptionally high-retention — far easier to capture, and far more valuable over a lifetime, than older ones. The directive went out in an email on November 7th.
“Mark has decided that the top priority for the company in H1 2017 is teens.”
Facebook is free. Instagram is free. But when a product is free, the user isn't the customer — the user is the product. The actual customers are advertisers, who pay Meta for access to human attention. The more attention Meta captures, the more it can charge. The longer someone stays on the platform, the more valuable they are. And the earlier in life you capture someone, the longer the runway — and the harder it becomes for them to leave.
Meta knew where to apply pressure. "Tweens," in the company's own language, were three times more likely to keep returning to Facebook than a 32-year-old. A 2018 internal document, titled "The Young Ones are the Best Ones," made the logic plain: “Tweens (approximate age 10–12) are special. People who join Facebook as tweens have the highest long term retention out of all age groups.”
There was one problem. Facebook's own terms of service required users to be at least thirteen. The tweens Meta was building its future around weren't old enough to be there.
Meta's internal documents show they decided it didn't matter. “If we wanna win big with teens, we must bring them in as tweens.”
They also knew what the product was doing to the children it was capturing. In one internal exchange, an employee described Instagram with four words: “IG is a drug.” A colleague responded without apparent alarm: “We’re basically pushers.” The conversation went on to compare the platform’s design to slot machines — systems “optimized to keep you engaged as much as possible.”
The exchange was not unique — similar language runs through the internal documents that would later surface in court. And for years, none of it left the building. Meta's public face was connection, community — bringing the world closer together, as the company's mission statement put it.
Many shared that belief when social media first arrived. The techno-optimism that greeted these platforms — the sense that connection was inherently good, that more of it could only be better — has since curdled into something darker.
It has undermined democracy, spread disinformation, gutted local news, and sowed division. But perhaps most alarming is what it has done to an entire generation of teenagers: an epidemic of depression, loneliness, and suicide that public health officials now treat as a crisis.
In 2024, the U.S. Surgeon General called the mental health crisis among young people an emergency and urged Congress to require warning labels on social media platforms — the same labels that go on cigarettes and alcohol. In December 2025, Australia became the first country to ban children under 16 from social media outright. Greece is planning similar action, and governments from Europe to Asia are watching to see if it works. More than a dozen American states have passed their own restrictions.
The question is no longer whether something went wrong. It is whether we can agree on what to do about it. This Solving For series will examine how it happened, why it took this long to reach a reckoning, and what's actually being done to fix it.

When the Lines Bent
To understand what went wrong, start with the numbers. For decades, rates of teen depression, anxiety, loneliness, self-harm, and suicide had held roughly steady.
Then social media arrived. Facebook in 2004. Twitter, now X, in 2006. Then the smartphone made it portable — the iPhone launched in 2007, the Samsung Galaxy in 2009. Suddenly, people carried social media everywhere.
Then, in 2012, the lines bent.
Depression came into view first. In the decade between 2013 and 2023, the share of high schoolers reporting persistent feelings of sadness or hopelessness jumped from 30 percent to 40 percent. Nearly 1 in 5 teenagers — about 4.5 million adolescents — had a major depressive episode in the past year. Among girls, the rate of persistent sadness was 53 percent. Among LGBTQ+ youth, 65 percent. The trends were consistent across race, income, and geography. The sharpest increases appeared among girls and young women.
The loneliness data is, in some ways, even more striking. From 1991 to 2007, teen loneliness had actually been declining. Then, starting around 2012, it reversed — and accelerated. By 2019, 48 percent more teens felt lonely than had in 2011. The World Health Organization now identifies teenagers as the loneliest age group on earth — 1 in 5 report experiencing loneliness, a rate higher than any other demographic, including the elderly. And this was not an American phenomenon. A study of students across 37 countries found that school loneliness increased between 2012 and 2018 in 36 of them. Wherever smartphone access and internet use were high, loneliness was high.
The suicide data tells the same story, on the same timeline. The share of teenagers seriously considering suicide was around 16 percent before 2012. By 2023 it was 1 in 5 — and that understates the scope. Today, suicide is the second leading cause of death for Americans ages 10 to 34. In 2024, 2.6 million teenagers had serious thoughts of suicide. Seven hundred thousand attempted it.
Researchers went looking for a cause. They ruled out the economy — the years between 2010 and 2015 were a period of steady economic growth and falling unemployment. They ruled out academic pressure — homework time barely moved. Income inequality had been widening for decades without producing a sudden break in the data. What had changed, suddenly and universally, was something else. Smartphone ownership crossed the 50 percent threshold between late 2012 and early 2013 — right when teen depression and suicide began to climb.

Rising Alarm, No Accountability
In 2019, Roger McNamee — an early Facebook investor and one of Zuckerberg's mentors — published Zucked, a first-person account of watching a platform he'd helped build become something he no longer recognized. In 2020, The Social Dilemma, a Netflix documentary featuring former engineers and executives from Facebook, Google, and Twitter, put the mechanics of the attention economy in front of a mass audience for the first time. It was watched by tens of millions of people in its first month. In 2024, social psychologist Jonathan Haidt published The Anxious Generation, which marshaled years of data to argue that social media and the smartphone had fundamentally rewired adolescent development — and that the industry had known, and had done nothing.
All of it — the books, the documentaries, the congressional hearings, the whistleblowers — struck a nerve. The Anxious Generation has been on the New York Times bestseller list for nearly two years. And yet none of it produced accountability. Meta and other social media companies marched forward, arguing publicly and in legislative testimony that the research was inconclusive, that correlation wasn’t causation, that they were committed to user safety.
Then came the litigation.
Thousands of families began suing Meta and other social media companies, alleging the deliberate design of addictive products targeting children. One of those cases, KGM v. Meta, went to trial in Los Angeles this year and lasted seven weeks. Jurors heard the internal documents read back to the company in open court — the emails, the slide decks, the employee exchanges about drugs and pushers and slot machines. And they heard from a young woman who had first logged on at six years old.
She is the plaintiff in the case, identified in court documents only by her initials — KGM. Her lawyers called her Kaley.
Kaley
She grew up in Chico, California, in a quiet cul-de-sac where her mother threw themed birthday parties and took her to Six Flags. She started using YouTube at six years old.
By the time she finished elementary school, she had posted 284 videos on YouTube. She was on Instagram by nine. As a child, she set up multiple accounts so she could like and comment on her own posts. She bought likes through a service where she could like other people’s photos and receive a flood of them in return. “It made me look popular,” she told the jury.
The notifications gave her a rush — she would slip away to the bathroom during school to check them. She leaned on Instagram’s beauty filters, tools that let her reshape how she looked to the world. A nearly 35-foot canvas banner of her Instagram photos was unfurled in the courtroom; she said almost all of them had a filter on them.
There is a video from this period, introduced at trial, that shows a young Kaley surpassing 100 YouTube subscribers — she is crying tears of joy. Then she turns to the camera and apologizes for her appearance. “I look so fat in this shirt,” the young Kaley says.
Kaley began cutting herself. She developed body dysmorphia, anxiety, depression, suicidal thoughts. When she tried to set limits on her use, the limits never held. She couldn’t stop.
Meta’s defense was direct: her problems preceded the platforms. Not one of her therapists had identified social media as the cause of her mental health struggles. The company pointed to a turbulent home life. The jury’s task, under California law, was not to find that Instagram caused Kaley’s suffering — only that it was a substantial factor in it.

The First Finding
On March 25, 2026, the jury held Meta liable for deliberately harming a child — for acting with “malice, oppression, or fraud.” They had looked at everything the company knew, everything it decided, and everything that followed — and they held it accountable.
The verdict came one day after a New Mexico jury found Meta liable for misleading consumers about the safety of its platforms and endangering children, ordering $375 million in civil penalties.
The Los Angeles award was smaller: $3 million in compensatory damages and an additional $3 million in punitive damages, with Meta bearing 70 percent of the liability and Google-owned YouTube 30 percent. But the size of the award was not the point. The verdict was a bellwether — a test of whether juries would hold social media companies responsible for the harm their products caused. This one did. Thousands more cases are waiting.
It has prompted comparisons to another industry that once seemed untouchable: tobacco.

A Signal, Not Yet a Reckoning
The evidence is now public. Meta wasn’t just aware that its platforms were reaching younger and younger users — it was engineering that outcome, internally targeting tweens between the ages of 10 and 12 as the foundation of its long-term growth strategy. It also knew, from its own research, that teenagers were being harmed by those same platforms. And it continued optimizing for engagement anyway. With more than 10,000 similar suits still pending, KGM v. Meta may be less an ending than an opening — the first verdict in what could be a sustained legal reckoning.
But litigation is not the same as a solution. The tobacco industry paid hundreds of billions of dollars in settlements and kept selling cigarettes. What changed tobacco was not the verdicts alone — it was what the verdicts made possible: regulation, restriction, a cultural shift so complete that a cigarette in a children’s movie is now unthinkable. Whether social media follows that arc, or whether the companies find ways to absorb the legal costs and optimize on, remains an open question.
For now, the verdict proved one thing above all: these companies can be held to account in a court of law. For three decades, that accountability was effectively foreclosed — blocked by 26 words written into law in 1996 that gave platforms broad immunity from exactly this kind of suit. Into that vacuum, the industry built the architecture of addiction: infinite scroll, recommendation algorithms tuned for compulsion, beauty filters tested on teenage girls, a product strategy that explicitly targeted children before they were old enough to resist it.
Years before a jury reached that conclusion, at least one person inside Meta already had. According to court filings, an employee reacted to the company’s push to recruit underage users with a message to colleagues: “Oh good, we’re going after <13 year olds now? Zuck has been talking about that for a while… targeting 11 year olds feels like tobacco companies a couple decades ago. Like we’re seriously saying ‘we have to hook them young’ here.”
Solving For is a deep-dive series that takes on one pressing problem at a time: what's broken, what's driving it, and what a path forward might look like.
Previous series have examined rare earth dominance, AI safety, the decline of local news, the end of amateurism in college sports, shrinking competition in Congress, and a world rearming as the global rules-based order weakens. Learn more at solvingfor.io.


