
Court filings show Instagram faced lengthy delays in introducing youth safety features, such as nudity filters


Lawyers in a lawsuit over whether social media apps such as Instagram are addictive and harmful pressed Meta on why it took so long to release safety tools, such as a nudity filter for private messages sent to teenagers. In April 2024, Meta announced a feature that automatically blurs nude images in Instagram DMs – addressing a problem the company's own executives had flagged roughly six years earlier.

In a deposition recently filed in federal court, Instagram head Adam Mosseri was questioned about an August 2018 email from Guy Rosen, Meta's VP and Chief Information Security Officer, in which Rosen said that "awful" things can happen over Instagram's private messages, known as DMs. That material could include unsolicited nude images, the plaintiffs' attorney suggested, and Mosseri agreed.

However, Mosseri pushed back on a line of questioning suggesting the company should have notified parents that its messaging system was not monitored, beyond the removal of CSAM (child sexual abuse material).

"I think it's obvious that you can send sensitive messages in any messaging app, whether it's Instagram or not," Mosseri said. He said the company tried to balance users' interest in privacy with its interest in safety.

The deposition also surfaced new statistics about negative experiences on Instagram: 19.2% of respondents aged 13 to 15 said they had seen nude or sexual images on Instagram that they did not want to see. In addition, 8.4% of 13- to 15-year-olds said they had seen someone harm themselves or threaten to do so on Instagram within the previous seven days of using the app.

Although the nudity filter is one of several changes added to Instagram in recent years to protect young people, the plaintiffs' focus was on Meta's delay in acting, not on whether the app is safe for young people now.

Mosseri was also questioned on other topics, including a 2017 email from a Facebook intern who wanted to identify "addicted" Facebook users and see whether there were ways to help them.


The 2018 email chain was presented as one example that Meta knew about the danger to minors, yet it took the company until 2024 to release a feature addressing sexual images sent to young people. That includes photos sent by adults who may be grooming them – a process in which an adult builds trust with a minor over time in order to exploit or abuse them.

When asked for comment, Meta spokesperson Liza Crenshaw outlined some of the ways the company has worked to protect young people over the years. "For more than a decade, we have been listening to parents, working with experts and law enforcement, and conducting extensive research to understand the issues that matter most to young people's experiences. We are proud of the progress we have made, and we are always striving to improve," she said.

Mosseri's deposition comes as several ongoing cases seek to hold major tech platforms responsible for harming young people. This one, taking place in the US District Court for the Northern District of California, involves plaintiffs who argue that social networks are defective products because they are designed to maximize screen time, encouraging compulsive behavior among young people. The defendants include Meta, Snap, TikTok, and YouTube (Google).

Similar cases are also underway in Los Angeles County Superior Court and in New Mexico.

Lawyers in these cases hope to prove that big tech companies prioritized user growth over addressing issues that could harm younger users.

The trial comes amid a wave of laws restricting youth use of social media, in several US states and abroad.

Updated after publication with Meta comment.



