At a Glance
- Meta faces its first state-level trial over child-safety claims.
- New Mexico Attorney General Raúl Torrez accuses the company of failing to block sexual content for minors.
- Meta seeks to exclude research, news stories, financial data, past privacy violations, and even CEO Mark Zuckerberg’s college history from evidence.
Why it matters: The case could set a precedent for how tech companies handle child-protection lawsuits in the United States.
Meta is scheduled to go on trial in New Mexico on February 2. The lawsuit, filed in late 2023, accuses the company of failing to protect minors from online predators, trafficking, and sexual abuse on its platforms. The state claims that Meta allowed explicit material to reach minors and did not implement adequate child-safety measures.
The lawsuit is the first trial of its kind at the state level. It marks a significant moment for both the company and regulators, as it could influence how social-media platforms address child-protection policies nationwide.
The Lawsuit
The complaint was filed by New Mexico Attorney General Raúl Torrez. It focuses on three core allegations:
- Failure to block sexual content that can be accessed by minors.
- Inadequate child-safety measures that allow predators to target young users.
- Negligence in preventing sexual exploitation on the platform.
The state argues that these claims are supported by evidence that the company did not act quickly enough to remove or restrict such content, and that the lack of robust safeguards creates an environment in which minors are at risk.
Meta’s Legal Strategy
In preparing for the trial, Meta’s lawyers are attempting to narrow the scope of evidence that can be presented. Public records reviewed by Wired show that the company wants to block:
- Research on social media’s impact on youth mental health.
- Stories about teen suicides linked to social media.
- Any mention of Meta’s finances.
- The company’s past privacy violations.
- Details about CEO Mark Zuckerberg’s college years.
- Mentions of the company’s AI chatbots.
- A public health warning issued by former U.S. Surgeon General Vivek Murthy about social media’s effect on youth mental health.
- Surveys, including its own, about the amount of inappropriate content on its platforms.
The legal team argues that these pieces of information are either irrelevant or could unfairly sway the jury. This broad request to exclude information has drawn criticism from two legal experts who spoke to Wired. They noted that the scope of the request is unusually extensive for a case of this nature. The experts also warned that a court might view the request as an attempt to suppress evidence that could be vital to the state’s argument.

Key Allegations
| Allegation | Evidence Mentioned |
|---|---|
| Explicit material reached minors | Complaint cites lack of filtering |
| Predators targeting young users | Complaint cites inadequate safeguards |
| Failure to prevent sexual exploitation | Complaint cites negligence |
The state’s complaint also argues that Meta’s policies have not kept pace with the evolving threat landscape, suggesting that the platform’s child-safety measures are insufficient and that the company has not taken proactive steps to mitigate risk.
Trial Timeline
| Date | Event |
|---|---|
| Late 2023 | Lawsuit filed by Raúl Torrez |
| February 2 | Trial scheduled to begin in New Mexico |
The trial will be the first of its kind at the state level, making the outcome closely watched by both the tech industry and regulators. If the court sides with the state, it could compel Meta to overhaul its child-safety policies and potentially lead to increased regulatory scrutiny.
Potential Outcomes
- Victory for the state could force Meta to adopt stricter safeguards and face additional lawsuits in other jurisdictions.
- Victory for the company would likely hinge on successfully narrowing the admissible evidence, though a finding of even partial liability could still require policy adjustments.
- Mixed outcome could result in a settlement that includes new safety measures without a full admission of wrongdoing.
What This Means for Meta
- Reputational risk: The allegations could damage the company’s public image, especially among parents and advocacy groups.
- Operational impact: A ruling requiring stricter child-safety measures could necessitate significant changes to the platform’s design and moderation systems.
- Legal precedent: A decision against Meta could open the door for similar lawsuits in other states, potentially leading to a patchwork of regulations.
- Regulatory scrutiny: The case may prompt state regulators to pursue further investigations into Meta’s safety practices.
The company’s attempt to block such a wide range of evidence is, in the view of the experts who reviewed the filings, unusually broad. If it succeeds, it could set a new standard for how tech firms defend themselves in child-protection cases. The experts caution, however, that the request could backfire if the court deems the contested evidence essential to the case.
Background on Child Safety in Social Media
The lawsuit reflects a growing national focus on protecting children online. State attorneys general have increasingly targeted large platforms for alleged lapses in safety protocols. The case underscores the pressure on companies to demonstrate that they are actively preventing sexual content from reaching minors.
Legal Experts’ Concerns
The two experts who discussed Meta’s requests with Wired expressed worry that the company is trying to shield itself from scrutiny. They pointed out that excluding research on youth mental health and public health warnings could undermine the court’s ability to assess the broader impact of social media on young users.
Potential Impact on Users
If the court finds Meta liable, the company may be required to implement stricter age verification and content moderation tools. Users could see changes to how the platform displays or filters content, potentially affecting the overall user experience.
Industry Reactions
Meta has not released an official statement, but analysts suggest that the trial will prompt other platforms to review their child-safety protocols. The case also underscores the growing scrutiny that state attorneys general across the country are applying to social-media companies.
Takeaways
- Meta is facing a historic state-level trial over child-safety allegations.
- The company seeks to exclude a wide range of evidence, from research to financial data.
- The outcome could influence how social-media platforms address child-protection nationwide.
- The trial, set for February 2, will be closely monitored by the industry and regulators alike.

