New Mexico Jury Orders Meta to Pay $375 Million Over Child Exploitation, User-Safety Claims

The verdict is the first jury decision on these types of claims against Meta, which is battling a wave of litigation over the effects of its platforms on young people’s mental health.

A jury in New Mexico has ruled that Meta Platforms broke state law, concluding that the tech giant misled users about safety on Facebook, Instagram and WhatsApp and enabled child sexual exploitation on its services, as alleged in the state attorney general’s lawsuit.

After less than a day of deliberations, jurors determined Meta violated New Mexico’s consumer protection statute and ordered the company to pay $375 million in civil penalties.



“We respectfully disagree with the verdict and will appeal,” a Meta spokesperson said in a statement. “We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content.”

“The substantial damages the jury ordered Meta to pay should send a clear message to big tech executives that no company is beyond the reach of the law,” New Mexico Attorney General Raúl Torrez said.

In a second phase scheduled for May, Mr. Torrez said his office will request court-ordered changes to Meta’s platforms to protect children and seek additional financial penalties.


State lawyers had sought more than $2 billion in damages.

The jury’s ruling concluded a six-week trial in Santa Fe. Mr. Torrez, a former prosecutor who succeeded Hector Balderas as attorney general, accused Meta of permitting predators to access underage users and of connecting them with victims, conduct he said led to real-world abuse and human trafficking.

“Over the course of a decade, Meta has failed over and over again to act honestly and transparently,” Linda Singer, counsel for the state, told jurors during closing arguments. “It’s failed to act to protect young people in this state.”

Meta denied those allegations and pointed to what it described as extensive safeguards for younger users. “What the evidence shows is Meta’s robust disclosures and tireless efforts to prevent harmful content. And these disclosures mean that Meta did not knowingly and intentionally lie to the public,” Kevin Huff, an attorney for Meta, told the jury on Monday.

Reuters viewed the trial on Courtroom View Network.

Scrutiny of Meta’s handling of child and teen safety has intensified in recent years, amplified by 2021 whistleblower testimony to Congress alleging the company knew its products could be harmful yet did not act.

Separately, Meta faces thousands of lawsuits claiming it and other social platforms deliberately engineered products that hook young users and contributed to a nationwide youth mental health crisis.

Some of those cases, filed in state and federal courts, seek damages in the tens of billions, according to Meta’s regulatory filings.

A state court jury in Los Angeles is currently deliberating in the first trial over the addiction-related claims. Meta has argued it is protected from liability in both the addiction suits and the New Mexico case by the First Amendment and Section 230 of the Communications Decency Act, which generally shields websites from claims based on third-party content.

The company has maintained that the harms alleged by the state are inseparable from third-party content on its platforms, because its algorithms and design choices merely deliver that content.

The New Mexico judge rejected Meta’s Section 230 defenses, allowing the matter to proceed to trial.

Undercover operation

The lawsuit in New Mexico grew from a 2023 undercover operation led by Mr. Torrez, a former prosecutor. Investigators created Facebook and Instagram accounts that posed as users under 14.

Those accounts received sexually explicit material and were contacted by adults seeking similar content, which led to criminal charges against multiple individuals, according to the attorney general’s office.

New Mexico contends Meta promoted Instagram, Facebook and WhatsApp as safe for the state’s children and teenagers while concealing the scale of dangerous content present on its platforms.

State lawyers point to internal documents that they say acknowledge problems with sexual exploitation and mental-health harms, yet show the company failed to implement basic protections such as effective age verification, even as it publicly insisted the services were safe.

The complaint also alleges Meta designed features to maximize engagement, such as infinite scroll and auto-play videos, despite evidence that these mechanics worsened addictive behavior among children and contributed to depression, anxiety and self-harm.

On Tuesday, the jury concluded Meta knowingly engaged in an unfair or deceptive trade practice in violation of the state’s consumer protection law. Jurors also found the company’s conduct unconscionable, determining Meta took advantage of New Mexico residents’ lack of knowledge.

The jury identified 75,000 violations and awarded $5,000 for each, producing the $375 million total penalty.

In May, Judge Bryan Biedscheid will preside over a bench trial on the state’s separate claim that Meta created a public nuisance that harmed residents’ health and safety. The state has said it will ask the judge to require Meta to implement measures such as effective age verification and to remove predators from its platforms.