By KRISTA GONZALES, Albuquerque
As a parent in New Mexico, I’ve seen how phones and tablets have become an everyday part of our children’s lives. Whether they’re messaging friends, doing homework, or scrolling through social media, screens are everywhere.
But the dark side of technology has led to horrific outcomes. Facebook, for example, was recently caught using pornography to train its new AI large language models, products that kids are turning to more and more. It’s just the latest example of Facebook’s leaders knowingly designing addictive and dangerous products that harm children.
Facebook, of course, isn’t alone. TikTok has faced frequent criticism for its shady data collection practices, and apps like Snapchat have repeatedly failed to address serious safety concerns that put kids and teens at risk.
These developments have contributed to a rise in mental illness, disordered eating, and suicide among children throughout the country.
Unfortunately, the story is no different here in New Mexico. Nearly 20% of our children are struggling with mental health challenges. Many experts have linked this growing crisis to the rise in social media use. Children and teens are spending more time online, and platforms like Facebook and TikTok are doing little to protect them. Internet addiction is also on the rise, yet tech companies continue to prioritize profits over safety.
To their credit, lawmakers are starting to address this growing crisis. In Congress, legislation is being considered to make the internet safer for young people. Here in New Mexico, Attorney General Raúl Torrez has joined dozens of his counterparts in suing Facebook for targeting children with addictive features. More than a dozen states and the District of Columbia have filed a similar case against TikTok.
But even as these efforts move forward, social media companies are pushing back with their own proposals. Their goal is not to solve the problem, but to shift the blame.
One example is Instagram, Facebook’s subsidiary, and its “Teen Accounts” initiative that launched last fall. The company says it will use age verification to block minors from seeing harmful content. But consumer rights leaders have already pointed out that Facebook will still be able to profit from its most predatory behaviors. The setup Facebook would like you to believe protects teens still allows the company to collect children’s sensitive data and target them with deceptive advertisements. In short, Facebook will get to keep doing what it is doing and rake in the profits, all while our children suffer.
Now, Facebook is onto its next scheme. It is pushing a proposal in state capitals across the country that would require app stores to verify users’ ages before allowing them to download certain apps. At first glance, this might sound like a step forward. In reality, it would do little to protect children and much to help tech companies avoid accountability.
Facebook’s preferred legislative solution is to set up an age verification checkpoint when users go to download an app. For Facebook, this proposal is a win-win. The proposal includes convenient carveouts letting children continue to access dangerous platforms simply by using a web browser, smart TV, or gaming console. At the same time, it would shift responsibility away from social media companies and onto app stores, giving Facebook legal protection while it continues profiting from young users. It would also force app stores to share more personal information with the same companies that have repeatedly mishandled user data.
Facebook has spent millions of dollars on advertising and lobbying to persuade lawmakers to adopt its solution. Thankfully, state legislators here have rejected the proposal so far.
Still, social media companies need to be regulated. Facebook, TikTok, and other large social media companies have shown time and time again that they’re unwilling to take meaningful steps to protect children on their platforms. PR stunts like Teen Accounts and ineffective age gates won’t protect children. Instead of listening to Big Tech about how to make social media safer, lawmakers should step up and hold these companies accountable.
The experience children have on social media platforms must be subject to clear and enforceable regulation to ensure their safety and well-being. For too long, companies have prioritized engagement and profit over child protection, designing algorithms and features that expose young users to harmful content, addictive patterns, and predatory behavior. Regulation should require platforms to adopt age-appropriate design standards, limit exploitative data collection, and provide parents with meaningful tools for oversight and control. Just as we hold toy manufacturers and broadcasters accountable for children’s safety, tech companies must be legally obligated to create safer, more responsible digital environments for young users.
Families in New Mexico deserve real protection. It’s time for our elected officials to stand up for us and reject the priorities of tech behemoths like Facebook.