New York City Mayor Eric Adams (D) on Wednesday announced a lawsuit targeting five major social media platforms, holding the companies that host the platforms responsible for "fueling a national youth mental health crisis."
The lawsuit, filed in California by New York City, seeks accountability from the companies behind five major platforms: Meta's Facebook and Instagram, Snap's Snapchat, ByteDance's TikTok, and Google's YouTube.
"Over the past decade, we have seen how addictive and overwhelming the online world has become, exposing children to a constant stream of harmful content and fueling a mental health crisis among adolescents nationwide," Adams said in a press release.
"Today, on behalf of millions of New Yorkers, we are taking bold action to hold these companies accountable for their role in this crisis and to step up our efforts to address this public health danger. This lawsuit and action plan are part of a larger reckoning that will shape the lives of young people, cities, and societies for years to come," Adams added.
The lawsuit aims to "force tech giants to change their behavior and recover the costs of addressing this public health threat," according to an official press release, which noted that the city spends more than $100 million each year on mental health programs for youth.
The 305-page complaint makes several sweeping allegations against all of the defendants before detailing claims against individual companies.
The lawsuit alleges that the defendants “caused” a mental health crisis among young people and that the companies “could have avoided causing harm to plaintiffs in New York City.”
The complaint also alleges that the defendants specifically target school-age children as a "core market," and that "millions of children" are "forced" to use the social media platforms, even during school hours.
The plaintiffs further claim in the lawsuit that the social media platforms were "designed, developed, produced, operated, promoted, distributed, and marketed to attract, capture, and addict youth with minimal parental supervision."
The lawsuit comes amid a flurry of litigation seeking to hold social media platforms accountable for contributing to rising mental health problems among children and teens.
In response to the lawsuit, representatives of the social media companies denied most of the suit's allegations and emphasized the safety and privacy features their companies have developed in recent years.
"Providing young people with safer and healthier experiences online has always been at the core of our work," Google spokesperson Jose Castañeda said in a statement reported by Axios. "We have worked with youth, mental health and parenting experts to build services and policies that provide age-appropriate experiences for young people and robust controls for parents. The allegations are simply not true."
In a widely reported statement, a spokesperson for Meta said: "We want to provide teens with a safe and age-appropriate online experience, and we have more than 30 tools and features to support them and their parents … We have spent a decade tackling these issues, and we employ people who have dedicated their careers to keeping young people safe and supported online."
"Snapchat was intentionally designed to be different from traditional social media, with a focus on helping Snapchatters communicate with their closest friends," a Snap spokesperson said in a statement. "Rather than a feed of content that encourages passive scrolling, Snapchat opens directly to the camera, and there are no traditional public likes or comments. While we will always have more work to do, we feel good about the role Snapchat plays in helping close friends feel connected, happy, and prepared as they face the many challenges of adolescence."
A TikTok spokesperson said, "TikTok has industry-leading safety features to support the well-being of teens, including age-restricted features, parental controls, and an automatic 60-minute time limit for users under 18. We regularly partner with experts to understand emerging best practices, and we will continue working to keep our community safe by addressing industry-wide challenges."
Copyright 2024 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.