2025 is off to an exciting start. A new presidential administration has taken office, and the 119th Congress is in session. TikTok faces an uncertain future after a law requiring ByteDance to sell TikTok went into effect, and the President issued an executive order delaying the enforcement of the law. Key pieces of legislation including the Kids Online Safety Act (KOSA) and the Children and Teens Online Privacy Protection Act (COPPA 2.0) failed to pass in the previous Congress. While it remains uncertain what form these bills will ultimately take, Senator Blumenthal, one of KOSA’s original sponsors, has vowed to continue his efforts this Congress.
New state laws regulating – or outright banning – social media for minors have either recently gone into effect or will soon. These laws are already facing legal challenges; courts appear cautious about age verification requirements but remain open to them if implemented thoughtfully. Rulings on these laws could determine the fate of age-based online regulations across the country. Meanwhile, many states are considering school cell phone bans or have already implemented them, and policymakers across the political spectrum are grappling with how to regulate artificial intelligence.
Social media companies are also responding to this evolving landscape with changes to their own policies, as demonstrated by Meta’s shift away from fact-checkers to a model similar to X’s community notes. These developments reflect the rapidly changing environment in which we operate, signaling an eventful year ahead.
Children and Screens’ Policy Efforts for 2025
What does this mean for Children and Screens’ work in policy? Last November, Children and Screens submitted a transition memo to the Trump-Vance administration. In 2025, we will continue advocating for these priorities at both the state and federal levels. Our efforts focus on three primary domains:
- Advancing legislative reforms similar to KOSA and COPPA 2.0, as well as their key components.
- Fostering platform transparency and supporting public interest research.
- Strengthening regulatory frameworks to ensure robust enforcement of existing and future laws.
Of course, these initiatives will be grounded in the best available empirical evidence, which serves as the foundation of all our work.
While the stall in the passage of key legislation in 2024 was disappointing, we remain optimistic. This Congress, we will build on the momentum gained and the exposure that nationwide debate brought to key issues in child online safety. Recent conversations with Republican leadership suggest there are opportunities for pragmatic bipartisan legislation to move forward. We are already engaging with key members of Congress and remain at the ready as relevant committees organize and begin their work.
State-Level Efforts
Last year, we testified before the Vermont legislature in support of its age-appropriate design code bill. Although the bill was ultimately vetoed, its passage through the legislature demonstrates growing momentum. Similarly, while California’s age-appropriate design code is under preliminary injunction, Maryland’s version has taken effect without challenge. We see this success as an opportunity to help states develop constitutionally sound legislation, and we will continue to support such efforts nationwide.
As school cell phone bans gain traction, we are actively involved in aligning policies with research-based best practices and encouraging robust tracking of their outcomes. We also see opportunities to advance state-level laws restricting the use of personalized algorithms for minors, building on laws that have been upheld in New York and California. We are collaborating with stakeholders to advance similar legislation and strengthen enforcement measures across the board.
Key Issue for 2025: Age Verification
Age verification will likely be a defining issue in 2025. Courts have signaled that, absent non-invasive, low-burden methods of verifying user age, strong age verification requirements risk violating free speech protections. However, the technology is evolving rapidly, and other countries – including the United Kingdom, Australia, and European Union member states – are actively implementing age verification to enforce their own laws. We are closely monitoring these developments to identify effective strategies for adopting age verification domestically.
The Role of Industry
Meta’s recent elimination of fact-checkers may signal broader shifts in platform policies in response to the new administration and Congress. This could usher in an era of decreased content moderation, raising concerns about implications for youth mental health and well-being, as well as access to reliable information online. While platforms have demonstrated the ability to adapt policies in response to a variety of factors – as we saw with Instagram Teen Accounts – they can just as easily reverse course. Internally created and enforced policies are non-binding and can be changed on a whim. Without robust legislation, there are no guarantees that platforms will maintain or adhere to positive policy changes. As shown in our UK Age-Appropriate Design Code Impact Assessment, companies will comply with regulations when they are clear and enforceable.