
Meta has finally been found liable for harming young users. Now what?


Meta lost a lawsuit brought by the state of New Mexico last week, marking the first time a court has held the company accountable for endangering children's safety. That verdict was remarkable on its own – but the very next day, Meta lost another case when a Los Angeles jury found that the company deliberately designed its apps to be addictive to children and teenagers, harming the mental health of the plaintiff, a 20-year-old identified as KGM.

These verdicts open the door to a wave of litigation over Meta's deliberate targeting of young users despite knowing its apps can harm them. Thousands of cases like KGM's are pending, and some 40 state attorneys general have filed lawsuits against Meta similar to New Mexico's.

Even though social networks are legally shielded from liability for what users post on their platforms, it wasn't the platforms' content that was on trial this time. It was their design – features like endless scrolling and round-the-clock notifications.

“They took the playbook that was used against the tobacco industry years ago, and instead of going after things like the content, they focused on these other things – how the platform is built and designed – which is distinct from the content when you have a First Amendment dispute,” Allison Fitzpatrick, a digital media lawyer and partner at Davis + Gilbert, told TechCrunch. “In both cases, it turned out to be successful.”

After a six-week trial, the jury in the New Mexico case found Meta liable for violating the state’s Unfair Practices Act, ordering the company to pay $5,000 in damages per violation, for a total of $375 million. In the Los Angeles case, which apportioned 70% of the responsibility for KGM’s harm to Meta and 30% to YouTube, the companies were ordered to pay $6 million. (Snap and TikTok were dismissed from the case before trial.)

“This is nothing to the Metas of the world,” Fitzpatrick said. “But if you take $6 million and multiply it by all the cases they have pending, that’s a lot.”

“We respectfully disagree with these rulings and will appeal,” a Meta spokesperson told TechCrunch. “Reducing something as critical as youth health to a single threat leaves many of the challenges facing young people today unaddressed and ignores the fact that many young people rely on digital networks to connect and find friends.”


During the proceedings, internal Meta documents surfaced showing the company’s inaction over harms its platforms were known to cause children, and its constant experimentation to increase the amount of time young people spend on its apps – whether they’re at school or using “finstas,” the “fake Instagram” accounts young people create mainly to hide their activity from parents or teachers.

One document reported the results of a 2019 study in which Meta conducted 24 one-on-one interviews with people whose use of the app was considered problematic – a label that applied to roughly 12.5% of users.

“The best external research shows that Facebook’s impact on people’s lives is negative,” the report says.

Other internal documents cited statements from Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri about prioritizing youth engagement. Commenting on Facebook Live’s traction with young people, Zuckerberg wrote, “the idea is that we need to be more efficient without letting parents/teachers know.”

In other posts, Meta employees openly talked about the company’s goals in increasing the retention of young people.

“We’ve learned one of the things we need to improve is avoiding looking at your phone in the middle of Chemistry :),” one employee wrote in an email to Meta CPO Chris Cox.

“No one wakes up thinking they want to increase the number of times they open Instagram that day,” Meta’s VP of Product Max Eulenstein wrote in an internal email in January 2021. “But that’s what our growth teams are trying to do.”

A Meta spokesperson told TechCrunch that many of the documents are around a decade old, and that the company has been listening to parents, experts, and regulators about how its platforms should operate.

The spokesperson pointed to Instagram’s Teen Accounts, launched in 2024, which come with default protections for young users. These include making accounts private by default and restricting who can tag or mention teens in posts. Instagram also sends reminders nudging young people to leave the app after 60 minutes – settings that users under 16 can only change with parental consent.

For Kelly Stonelake, a former director of product marketing at Meta who worked at the company from 2009 to 2024, these revelations come as no surprise. (Stonelake is suing Meta over alleged gender discrimination and harassment.)

“A lot of the unsealed evidence mirrors my own experience,” she told TechCrunch.

At Meta, Stonelake led the go-to-market process for the Horizon Worlds VR app as it rolled out to young audiences. She says she raised concerns about the lack of moderation tools in the metaverse, but her objections were not taken seriously.

The US government has taken a keen interest in children’s online safety, especially after Meta whistleblower Frances Haugen released internal documents in 2021 showing that Meta knew Instagram was harming teenage girls.

Although Congress has advanced significant legislation to address children’s online safety, many of these bills may do more to surveil adults and censor speech than to protect children, some privacy activists say.

“There is no universe where surveillance or ‘age verification’ laws, passed under the guise of protecting children, don’t enable mass online censorship of speech that Trump doesn’t like,” Fight for the Future director Evan Greer said in a statement.

Stonelake once lobbied on Capitol Hill for the Kids Online Safety Act, which has been the most prominent of these legislative initiatives, drawing support from companies such as Microsoft, Snap, X, and Apple. But as the bill evolved and changed, she came to oppose it.

“I recommend a ‘no’ vote on the current version,” she said, pointing to provisions in the bill that would preempt state regulation of the technology industry. “There is very recent language that would close the courthouse doors to school districts, to bereaved families, to states, and that’s sad.”

This language could, for example, negate the lawsuit that New Mexico brought against Meta.

“We need people to come to the table with solutions, instead of what they’re doing now, which is just telling a different story on each side of the aisle to frustrate and scare people,” Stonelake said. “The real solution has to be complex and nuanced and take into account a number of priorities.”



