Court Filing Reveals Meta Executive Called Facebook Messenger Encryption Plan ‘So Irresponsible’

Internal documents show top Meta safety officials warned that encrypting Facebook Messenger and Instagram DMs could endanger children and limit law enforcement access to abuse cases.

New Delhi, Feb 24: Meta proceeded with its plan to implement end-to-end encryption on Facebook Messenger and Instagram direct messages despite internal warnings that the move could compromise child safety, according to newly released court filings in New Mexico.

Monika Bickert, Meta’s head of content policy, described the decision as “so irresponsible” in a March 2019 internal chat, questioning the company’s ability to flag child exploitation content if encryption were applied. The filings, part of a lawsuit brought by New Mexico Attorney General Raúl Torrez, include emails, messages, and briefing documents highlighting concerns from senior safety executives at Meta.

Torrez’s lawsuit alleges the company allowed predators to reach underage users, leading to real-world abuse and human trafficking. The New Mexico case is the first of its kind against Meta to reach a jury, and it coincides with litigation and regulatory scrutiny worldwide over the impact of the company’s platforms on youth safety and mental health.

Internal Concerns on Child Safety
Internal documents reveal projections that default encryption would cut Meta’s reporting of child sexual exploitation imagery to the National Center for Missing & Exploited Children from 18.4 million cases in 2018 to just 6.4 million, a roughly 65% drop. Safety executives also warned that the company would be unable to proactively provide information on hundreds of child exploitation, sextortion, and terrorism cases.

Bickert and Global Head of Safety Antigone Davis expressed serious doubts about publicly promoting the plan. “FB allows pedophiles to find each other and kids via social graph with easy transition to Messenger,” Davis wrote in a 2019 email. The documents note that WhatsApp, which was already encrypted but not linked to a social platform, posed far lower risks.

Meta’s Response
Meta spokesperson Andy Stone stated that the concerns raised in 2019 informed the development of safety measures implemented before launching encryption on Messenger and Instagram in 2023. While messages are now encrypted by default, users can still report harmful content, which Meta can review and refer to law enforcement.

Measures include special accounts for minors that prevent unconnected adults from initiating contact, along with other safety tools designed to detect abuse signals in encrypted chats. Stone emphasized that these features aim to mitigate the risks executives highlighted prior to the rollout.

The filings underscore the tension between enhancing user privacy and ensuring child protection on social media platforms, highlighting ongoing debates about the trade-offs of end-to-end encryption in semi-public messaging networks.
