SANTA FE, N.M. (RTW News) — Closing arguments began Monday in a pivotal New Mexico trial in which social media giant Meta is accused of misleading users about the safety of its platforms for children.
After six weeks of testimony from an array of witnesses, including educators, mental health experts, investigators, and former Meta employees, jurors are expected to begin deliberating the case today.
The trial, taking place in New Mexico state court, is among the first to test litigation against social media companies over their influence on minors.
Attorneys for the state of New Mexico argue that Meta, which operates popular apps including Instagram and Facebook, prioritized profits over child safety in violation of state consumer protection law. Their concerns center on the safety implications of the platforms' recommendation algorithms and messaging features.
“Young people are spending excessive time on Meta’s platforms, losing control of their engagement,” said Linda Singer, an attorney for the state, in her closing argument. “Meta was aware of this situation yet did not communicate it adequately.”
During the trial, Singer presented evidence suggesting that Meta’s algorithms steered teenagers toward sensational and potentially harmful content, and that the company failed to enforce its minimum age requirement of 13.
“The issues surrounding safety in this case were not mere oversights,” Singer emphasized. “They reflect a corporate approach that placed growth above the safety of children. The impact has been detrimental to youth in our state and beyond.”
Singer asked jurors to impose a civil penalty exceeding $2 billion on Meta, calculated from the maximum $5,000 fine for each of two counts of consumer protection violations, multiplied by an estimated 208,700 monthly users under 18 in New Mexico who may have been harmed.
Attorney General Raúl Torrez filed the lawsuit against Meta in 2023, arguing that the company created an online environment that attracted predators targeting minors and failed to disclose the risks associated with its platforms.
Meta’s attorneys assert the company does implement protections for young users and works to eliminate harmful content, while admitting that some content may occasionally evade their safety measures.
Testimony also spotlighted the gap between Meta executives’ public claims that the company improves safety and combats addiction without infringing on free speech, and the findings of its own internal research on user addiction.
In a parallel trial in California, a jury is deliberating whether Meta and YouTube bear responsibility for harms inflicted on children through their platforms — a decision that could have far-reaching implications for numerous similar cases across the country.
The outcome could shape how social media firms are held accountable when their algorithms contribute to risks children face online, potentially compelling tech companies to exercise greater caution in how they operate.





















