Protecting children has always been a shared responsibility in Alabama. It starts at home, is reinforced in our churches and schools, and carries through to the choices we make as a state. When lawmakers return to Montgomery this winter, the conversation around kids’ online safety should reflect those principles, not defer to the preferences of social media companies that have repeatedly failed to put children first.
Families across our state are rightly concerned about the role social media plays in their children’s lives. We have seen how platforms designed to capture attention can instead fuel anxiety, isolation, and harm. These concerns are not speculative. Internal documents and court filings have shown that Meta, the parent company of Instagram and Facebook, knew its products could negatively affect young users and, in many cases, chose growth over meaningful reform. Newly unsealed lawsuit filings allege that Meta downplayed or buried evidence about risks to children on its platforms, even as those harms became more widely understood.
That history matters as states consider new legislation aimed at protecting kids online. A proposal brought forward last session, and heavily backed by Meta, would shift responsibility away from social media platforms and onto app stores while collecting detailed information about every app a child uses. At first glance, that approach may sound reasonable. In reality, it misses the heart of the problem. Alabama families are not asking for more data to be gathered about their children. They are asking for safer products, better design choices, and real accountability from the companies creating these digital spaces.
Under the proposed framework, platforms like Facebook and Instagram would face fewer direct obligations to fix the systems that expose kids to harmful content. Instead, oversight would be pushed elsewhere, allowing the very companies that profit from engagement to sidestep responsibility. That is a troubling direction, especially given reporting that Meta has limited or suppressed internal research related to youth safety. According to those reports, safety research on Meta’s virtual reality products was slowed or sidelined when its findings conflicted with business objectives.
Scripture calls us to be good stewards of what is entrusted to us. That includes our children and the environments they grow up in. Stewardship does not mean handing more influence to companies that have already shown they struggle to self-regulate. It means setting clear expectations, placing responsibility where it belongs, and insisting that powerful platforms answer for the impact of their choices.
Alabama has the opportunity to pursue a better path: one that puts families first, keeps parents in the driver’s seat, and holds social media companies accountable for the products they build and promote. We should be cautious of solutions that sound protective but ultimately expand the reach and influence of the very platforms that created this problem in the first place. Our children deserve better, and Alabama lawmakers can help lead the way.