Newly unsealed court documents indicate that Meta knew as early as 2018 that adults could find and message minors on Instagram, including by sending explicit images, yet Instagram did not roll out automatic blurring of sexually explicit images in teens' direct messages until September 2024. The details come from a deposition of Instagram head Adam Mosseri and an internal email thread describing problems ranging from widespread harassment to severe harm, filed in a large multidistrict lawsuit in federal court. The filing also reveals internal survey results in which nearly one in five 13-to-15-year-olds reported receiving unwanted nudity or sexual imagery, and it cites an internal estimate, never shared publicly, of roughly 200,000 child users per day facing inappropriate interactions. Meta argues it introduced other protections in the interim, but critics say safety changes often lagged behind engagement and growth goals.