Today’s Supreme Court ruling upholding the TikTok ban underscores the urgent need for comprehensive action to protect children online.
The law, which bans TikTok unless it is divested by its Chinese parent company, is centered largely on national security rather than the well-being of children. While it is a step toward addressing the harms caused by unregulated social media, it does not address the broader issues children face online.
Research consistently shows that TikTok, and platforms like it, use sophisticated algorithms to maximize engagement, often by keeping users, including children, on their apps for extended periods. These algorithms can expose children to age-inappropriate content, contribute to negative mental health outcomes such as anxiety, depression, and poor self-esteem, and increase the risk of problematic or compulsive device use, all while being driven by enormous amounts of personal data. Policymakers must prioritize enacting robust regulations that ensure accountability and transparency in platform practices, particularly those related to algorithms, content moderation, and data privacy.
Tech companies and policymakers bear the responsibility for creating safer, healthier digital spaces by implementing child-centered design principles.