At a Glance
- The American Academy of Pediatrics says tech companies and governments, not parents, should bear primary responsibility for children’s digital safety
- Prolonged low-quality screen use linked to language delays, sleep issues, heart disease risk
- 9 countries examine Australia-style social-media bans for under-16s
- Why it matters: Tech’s engagement-first design is harming kids’ health worldwide
The American Academy of Pediatrics is demanding that technology firms and governments, not parents, take primary responsibility for protecting children from a digital ecosystem it calls “intentionally designed around engagement and commercialization.” In a policy statement released Tuesday, the leading U.S. child-health group outlined sweeping safety requirements for platforms targeting minors.
Health Toll of Engagement-First Design

The report catalogs documented risks of prolonged low-quality digital media use, including:
- Language delays, sleep disruption, anger issues
- Poor eyesight, weaker cognition, attention problems
- Elevated cardiometabolic risk, including heart disease and type 2 diabetes
These effects stem from design features such as algorithmic recommender systems, autoplay, intermittent rewards, user profiling, friend suggestions, and social-approval metrics that drive extended use.
“Intentionally designed around engagement and commercialization, this ecosystem is shaped by industry incentives and lies largely outside of the control of individual families,” the academy wrote. “Many parts of the digital ecosystem have business models based on data collection and advertising revenue.”
Global Momentum for Age Bans
Australia last month became the first major country to bar under-16s from social media. Roughly 5 million accounts have since been removed, though some teens claim to have bypassed the restriction. The policy is viewed as a potential template for other nations.
Countries now weighing or planning similar bans:
- Denmark, Malaysia, Norway
- France, Spain, Germany, Greece, Italy
- The wider European Parliament
The United Kingdom appears closest to following Australia. Prime Minister Keir Starmer said this week that British ministers will visit Australia to study the ban, and the government is preparing action. In the United States, federal passage is viewed as unlikely given tech-industry influence, but California and Texas have voiced interest in state-level measures.
The academy itself stops short of endorsing an outright ban, noting that high-quality, ad-free educational content can foster pro-social behavior in preschoolers and kindergarteners. “Child-centered designs are achievable, better for society, and can lead to digital products that promote children’s well-being,” the report said.
Mandatory Safety Framework
To shift responsibility from parents to corporations and regulators, the academy prescribed concrete steps:
For Tech Companies
- Create child-safety teams reporting directly to leadership
- Test products for safety through youth collaborations
- Block autoplay, targeted ads, data collection, and chat features for minors by default
For Policymakers
- Fund third spaces such as libraries and parks as offline alternatives
- Finance child-centered media like PBS Kids
- Restrict social-media use in schools for distraction-free learning
- Apply product-safety rules now used for food, cars, and medical devices to digital platforms
“Regulatory agencies could require that digital media companies who intend to include minors report key metrics on their digital content and patterns of use. Safety and well-being metrics could be regularly incorporated with earnings reports,” the academy suggested.
The organization is preparing a separate policy statement focused on artificial intelligence’s impact on children.
Key Takeaways
- Health experts say current digital design is fueling measurable physical and mental harm in children
- Australia-style age bans are gaining international traction, with the U.K. likely next
- Pediatricians want safety-by-default features, not optional parental controls
- The proposed framework treats digital platforms like traditional products subject to safety oversight