Meta survey found 19% of teens on Instagram have seen unwanted nude images

The Instagram app icon is seen on a smartphone in this illustration taken October 27, 2025. — Reuters

Nearly one in five users ages 13 to 15 told Meta they saw “nudity or sexual images on Instagram” that they didn’t want to see, according to a court filing.

The document, made public Friday as part of a federal lawsuit in California and reviewed by Reuters, includes portions of a March 2025 deposition of Instagram executive Adam Mosseri.

Mosseri said the company does not share survey results “in general,” adding that self-reported surveys are “notoriously problematic,” according to the deposition.

Meta, the owner of Facebook and Instagram, faces widespread allegations that the company’s products harm young users.

Across the United States, thousands of lawsuits in federal and state courts accuse the company of designing addictive products and fueling a mental health crisis among minors.

The statistics on explicit images come from a survey of Instagram users about their experiences on the platform, Meta spokesperson Andy Stone said, not from a review of the posts themselves.

The company announced that by the end of 2025 it would remove images and videos “containing nudity or explicit sexual activity, including when generated by AI,” with exceptions considered for medical and educational content.

About 8 percent of users ages 13 to 15 also said they had “seen someone harm themselves or threaten to harm themselves on Instagram,” according to the filing.

Most of the sexually explicit images were sent in private messages between users, Mosseri said in his deposition, adding that Meta must weigh user privacy when reviewing such messages.

“A lot of people don’t want us to read their messages,” he said.
