
Is social media safe for kids? Surgeon general calls for a warning label

Should apps like Instagram and TikTok come with a warning label for teens? Here's what you need to know.
Written by Radhika Rajkumar, Editor
Image: Elva Etienne/Getty

You've seen them before: mandated labels on cigarette packs and alcoholic beverages warning of health risks like lung cancer and fetal injury. Now US Surgeon General Dr. Vivek Murthy wants to add similar labels to social media.

In a New York Times op-ed on Monday, Murthy called for Congress to place health warnings on social media apps. Citing several studies, Murthy said "social media has emerged as an important contributor" to an ongoing youth mental health crisis in the US.

Also: This social network bans all AI images

Amid a broader national loneliness epidemic, research shows teen mental health has steadily declined over the past decade, especially in the wake of the COVID-19 pandemic. According to an April 2024 study, "adolescents ages 12-17 have experienced the highest year-over-year (YoY) increase in having a major depressive episode (MDE) since 2010." Suicide and suicidal ideation among teens are also up, according to data from the Centers for Disease Control and Prevention (CDC).

"A surgeon general's warning label, which requires congressional action, would regularly remind parents and adolescents that social media has not been proved safe," Murthy said in the op-ed. He has previously stated that 13, the common age minimum for many apps like Twitter and Instagram, is too young for social media exposure, and released guidance about the impact of social media on adolescents last year.

Also: The best parental control apps of 2024

However, Congress would have to pass a bill for social media apps to come with a health warning -- and none appear to be on the docket just yet. Besides, Murthy acknowledged that labels alone wouldn't sufficiently address the negative effects of these apps. But his urgency raises the long-debated question: Is social media safe for kids?

Mental health and social media 

The question of whether social media is safe is tricky to answer. While some study findings suggest risks, others are less conclusive. 

A 2019 study, which Murthy cited in his call to action, found that "adolescents who spend more than three hours per day on social media may be at heightened risk for mental health problems, particularly internalizing problems." According to Gallup, teens spend an average of nearly five hours a day on social media.

Also: Instagram launches two new features to help you fight back against bullying

Research from the American Psychological Association (APA) found that reducing social media use significantly improved body image among teens and young adults in just a few weeks.

However, a 2023 study states that while there's a general correlation between depression and social media use among teens, "certain outcomes have been inconsistent (such as the association between time spent on social media and mental health issues), and the data quality is frequently poor."

"Browsing social media could increase your risk of self-harm, loneliness, and empathy loss," the report continued, but "other studies either concluded that there is no harm or that some people, such as those who are socially isolated or marginalized, may benefit from using social media."

These findings acknowledge that social media can create positive outcomes for teens by providing access to communities they might not otherwise have, especially for LGBTQ+ youth and other marginalized groups.

Also: How to get rid of My AI on Snapchat for good

The surgeon general also noted in his criticism that there isn't enough research to suggest social media use is safe for kids -- an observation worth weighing against the mixed findings. His May 2023 advisory explores both positive and negative outcomes of social media use and encourages parents to create tech-free zones where possible.

Beyond social media, several factors likely contribute to the decline in youth mental health, including economic conditions. Some researchers theorize that reporting of depressive symptoms has increased because mental health is discussed so openly online. Overall, it's hard to say how much of the problem social media is responsible for.

Murky motives

Tech companies have admitted the gamification elements of social media can be addictive. Social media apps are free to download because users are the product: the companies behind them sell user data and ad space to advertisers. This means tech companies are incentivized to prolong the time users spend scrolling. 

In the 2020 Netflix documentary The Social Dilemma, several former employees of companies like Meta and Google -- including Tristan Harris, co-founder of the Center for Humane Technology -- explain the intentionally gamified nature of social media apps. Essentially, these apps are designed to create reward patterns in the brain to keep users on or returning to the platform as much as possible. 

Also: Anxiety-free social media? Maven thinks it has a formula for it

"For too long, we have placed the entire burden of managing social media on the shoulders of parents and kids, despite the fact that these platforms are designed by some of the most talented engineers and designers in the world to maximize the amount of time that our kids spend on them," Murthy told CNN in May 2023. 

Current and former employees, including leaders like Instagram head Adam Mosseri, have testified at multiple Senate hearings about the impacts of their technology. A 2021 hearing revealed that Facebook had internal data showing its algorithms negatively impacted users' mental health, but failed to act on this evidence.

Regardless of whether Congress moves to add warning labels to social apps, it's safe to say that some level of intervention could help hold tech companies accountable for their products, much as other industries are regulated for consumer protection.

Ongoing legislative approaches 

Even without action at the federal level, state governments across the US are moving to limit adolescent access to social media.

In October, DC Attorney General Brian L. Schwalb joined a bipartisan coalition of 42 other attorneys general in suing Meta. The suit argues that Meta "knowingly designed Instagram and its other social media platforms with features that lure in and addict children and cause harm to their mental, emotional, and physical health," and that it falsely assured the public of these apps' safety despite its own internal research.

Also: How this law protects your thoughts from tech companies

New York is close to enacting the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, which would bar platforms from serving algorithm-driven feeds to users under 18 without parental consent, defaulting instead to chronological feeds. The bill has passed both the state Senate and Assembly, and Governor Kathy Hochul is expected to sign it once it is finalized.

Florida Governor Ron DeSantis signed a bill in March preventing children under 14 from opening their own social media accounts and requiring anyone under 16 to get parental consent to open an account. The bill goes into effect on January 1, 2025. 

Moves to limit the reach of social apps aren't limited to the US. Just last month, the EU launched an investigation into whether Meta deliberately makes its apps addictive, prompted by growing concerns about teen use. 

But for now, it's unclear whether the surgeon general's recommendations will result in new health-related guidance.
