The First Digital Natives Speak Out: Why Gen Z Wants Adults to Stop Defending Social Media
A generation raised on Instagram and TikTok is rejecting the platforms that shaped their adolescence—and they want policymakers to listen.

A generational rift is emerging in the debate over social media regulation—but not the one most observers expected. While tech executives and free-speech advocates warn that government oversight threatens digital liberty, many of those who grew up entirely within the social media ecosystem are asking a different question: why did adults let this happen to us?
Recent court rulings in the United States have lent judicial weight to what researchers and mental health professionals have documented for years: social media platforms are not neutral communication tools. They are engagement engines designed to maximize user attention, often at the expense of psychological wellbeing. For the first cohort to reach adulthood without memory of a pre-smartphone world, that distinction has become impossible to ignore.
The timing of this reckoning reflects a broader maturation within digital culture. The oldest members of Generation Z—those born in the late 1990s—are now in their mid-to-late twenties, old enough to reflect critically on formative experiences that older generations observed only from the outside. What they describe bears little resemblance to the utopian vision of connection and democratized expression that Silicon Valley promised.
The Architecture of Addiction
According to research published by the American Psychological Association, adolescents who spend more than three hours daily on social media face double the risk of experiencing symptoms of depression and anxiety compared to non-users. The mechanisms are well-documented: algorithmically curated feeds that prioritize emotionally charged content, notification systems engineered to trigger dopamine responses, and infinite scroll features that eliminate natural stopping points.
These design choices were not accidental. Internal documents from Meta, disclosed by whistleblower Frances Haugen and examined in congressional hearings in 2021, revealed that the company's own researchers warned executives that Instagram was harmful to teenage girls' body image and mental health. The platform proceeded with minimal changes to its core engagement mechanics.
Young adults who experienced these systems during their developmental years describe the impact in terms that transcend typical generational complaints about technology. They speak of algorithmically induced anxiety, of comparison cultures that made authentic self-expression nearly impossible, of hours lost to platforms that left them feeling emptier than before they logged on.
A Shift in the Regulatory Landscape
The legal environment is beginning to reflect these concerns. U.S. courts have recently upheld state-level restrictions on social media companies' data collection practices targeting minors, rejecting industry arguments that such regulations violate First Amendment protections. These decisions represent a significant departure from the hands-off approach that characterized tech regulation for two decades.
The rulings have emboldened lawmakers across the Asia-Pacific region to consider similar measures. Australia's eSafety Commissioner has proposed age-verification requirements for social media platforms, while New Zealand's Parliament is reviewing legislation that would impose duty-of-care obligations on companies whose services are accessible to minors.
Industry groups have responded predictably, warning that aggressive regulation will stifle innovation and fragment the global internet. But these arguments carry less weight among those who grew up as the subjects of Silicon Valley's grand experiment in attention capture.
Beyond Parental Responsibility
One of the more persistent deflections in the social media debate holds that platform harms are primarily a failure of parenting—that responsible adult supervision can mitigate the risks these services pose to young users. This framing conveniently shifts accountability away from the companies that design psychologically manipulative systems and onto individual families.
Young adults who navigated adolescence on these platforms reject this narrative. They point out that expecting parents to counteract billion-dollar engagement optimization systems is both unrealistic and unfair. Many of their parents had no framework for understanding the psychological mechanics at play, having come of age in an era when "going online" was a discrete activity rather than an ambient condition of existence.
The comparison to other regulated industries is instructive. Society does not rely solely on parental supervision to protect children from harmful products in food, pharmaceuticals, or automobiles. We establish safety standards, require transparency about risks, and hold manufacturers accountable when their products cause harm. The question increasingly posed by digital natives is: why should social media be different?
Economic Incentives and Structural Reform
At the heart of the social media problem lies a business model that treats human attention as an extractable resource. Platforms generate revenue by selling access to users' attention to advertisers, creating a structural incentive to maximize engagement regardless of psychological cost. This is not a bug that can be fixed with better content moderation or improved parental controls—it is the fundamental architecture of the industry.
Some advocates are pushing for more radical interventions. Proposals include banning advertising-based business models for services accessible to minors, requiring platforms to offer chronological (rather than algorithmic) feeds as a default option, and imposing strict limits on data collection from young users. These measures would fundamentally alter the economics of social media, which may be precisely the point.
The tech industry's resistance to such reforms is familiar but increasingly untenable. As the first generation of digital natives reaches positions of influence in media, government, and civil society, the industry's preferred narrative—that social media is simply a tool, neutral in its effects and dependent entirely on how users choose to engage with it—is collapsing under the weight of lived experience.
The Path Forward
What remains unclear is whether the regulatory response will match the scale of the problem. Incremental reforms—age gates, content warnings, enhanced reporting mechanisms—may provide political cover without addressing the underlying incentive structures that make social media harmful. Meaningful change would require confronting the business models that have made these platforms among the most valuable companies in human history.
For young adults who grew up in the social media age, the stakes are both personal and societal. They watched their own mental health deteriorate in real time, observed friendships mediated through platforms designed to maximize conflict and comparison, and came of age in an information environment optimized for virality rather than truth. They are not asking for protection from ideas or uncomfortable speech. They are asking why profit-maximizing corporations were permitted to conduct a massive, uncontrolled experiment on their psychological development.
The answer to that question will determine whether the next generation of children grows up under the same conditions—or whether democratic societies finally assert that some things matter more than engagement metrics and advertising revenue.