New Mexico case puts internal decisions and platform design under the microscope
Meta is under intensifying legal and political pressure as a high-stakes trial in New Mexico brings renewed attention to how Facebook and Instagram handled risks involving children. The case has moved beyond general criticism of social media and into a more damaging phase, with prosecutors using internal company documents, former employee testimony and evidence from child-safety organizations to argue that Meta failed to act quickly enough on known harms while continuing to prioritize growth and engagement.
The proceedings have already exposed a wide range of allegations, from child exploitation and trafficking risks to claims that Meta's products were designed in ways that could encourage compulsive use, worsen body image distress and increase exposure to self-harm material. Meta has strongly rejected those accusations, describing them as sensational and misleading, while insisting that it has invested heavily in safety tools and cannot prevent every crime or harm across platforms used by billions of people.
Even so, the trial is becoming a major test of whether the company can still defend its long-running claim that its platforms are safer than the alternatives available to young users. It also raises a more existential business question: whether regulators, courts and lawmakers will allow Meta to keep relying on younger users for future growth if juries conclude that the company did not adequately protect the teens and children already on its services.
Internal records have sharpened scrutiny of exploitation risks
One of the most damaging parts of the case has centered on internal communications presented in court. Prosecutors highlighted messages warning senior executives about severe safety issues, including one email stating that Instagram had become a leading two-sided marketplace for human trafficking. They also presented claims that Meta struggled to detect and report child sexual abuse material, trafficking activity and grooming behavior with the speed and precision required for effective intervention.
The state’s case leans heavily on an undercover investigation known as Operation MetaPhile, in which agents posing as girls under 13 were allegedly contacted by suspects seeking sexual encounters after finding the decoy accounts through Meta’s platforms. According to prosecutors, some of those accounts drew enormous attention, including thousands of followers and hundreds of requests a day, without being shut down. Investigators said the company even sent one such account advice on monetization and audience growth rather than treating it as a likely child-safety threat.
Testimony from former employees has deepened the pressure. One ex-executive told the court he did not believe safety was treated as a true priority inside the company. Another former insider said the platform’s recommendation systems were highly effective at connecting predators with minors, and that senior leadership already understood the problem but did not act decisively enough.
Encryption and reporting failures have become a second line of attack
Another major theme in the trial has been Meta’s move to encrypt Facebook Messenger. Child-safety groups and law enforcement witnesses argued that end-to-end encryption sharply reduced visibility into abusive interactions and removed access to evidence that had previously helped identify exploitation and trafficking. Prosecutors said the decision hindered investigations, while the National Center for Missing and Exploited Children warned that reporting volumes fell sharply after encryption was introduced.
The jury also heard evidence that Meta’s reporting system had long suffered from serious operational weaknesses. Between 2017 and 2021, the company allegedly accumulated a backlog of 247,000 CyberTipline reports involving potential harm to children, with some reports delayed for weeks or months. Others were said to have been misclassified as low priority, reducing their value to law enforcement and potentially allowing perpetrators to escape timely scrutiny.
That issue goes to the heart of the case because it suggests that the problem was not only what users did on Meta’s platforms, but also whether the company’s internal systems were capable of responding to danger with enough speed and clarity. Prosecutors argue that poor-quality tips, delayed submissions and unreviewed automated reports created a system that looked active on paper but often failed in practice when time-sensitive intervention was most needed.
Mental health evidence broadens the stakes beyond exploitation
The New Mexico proceedings have also overlapped with wider allegations being litigated in Los Angeles, where plaintiffs claim Meta knowingly built products that encouraged addictive behavior and exposed minors to damaging content. Evidence introduced in court included internal presentations about attracting young users, documents suggesting the company closely tracked pre-teen engagement despite a formal age cutoff of 13, and warnings from staff about the risks of beauty filters and body image features.
Some of the most emotional testimony came from parents and advocates who linked Instagram content to serious mental health consequences for teenagers. The argument is not just that harmful material existed, but that platform design, recommendation systems and visual tools may have amplified vulnerabilities in young users rather than reducing them. Prosecutors also alleged that Meta allowed minors to interact with AI companions despite internal concerns, and that advertisements could appear alongside material sexualizing children.
Meta’s defense has focused on scale, investment and product safeguards. The company says it removes huge amounts of exploitative material proactively, places teenagers into stricter default settings and continues to build protections designed specifically for younger users. But the trial has made clear that this is no longer only a dispute over policy language. It is a battle over whether executives knew enough, early enough, to act differently.
The eventual verdict will matter well beyond New Mexico. If jurors find liability tied to exploitation, trafficking or intentional addiction, the consequences could extend into legislation, age-gating rules and further restrictions on how Meta reaches younger audiences. For a company that still depends on attracting the next generation of users, the case could become more than a courtroom setback. It could become a turning point in how its platforms are regulated and how much public trust they retain.