Published Thursday, December 11, 2025
Every few months, a new proposal surfaces to ban social media for everyone under a certain age. Different countries and U.S. states call for different age limits, but the pattern is the same: when adults get scared, they reach for the cleanest, simplest solution they can grab. Ban the thing. Shut it off. Limit social media access. It looks decisive in a headline, and it feels like action. But policymakers keep returning to the same question, “Should we ban social media?”, and in doing so they overlook a deeper truth: blanket bans designed to protect children rarely account for the complexity of what they’re trying to regulate.
After spending years inside this problem, as a parent, a tech co-founder, and a former U.S. Army intelligence officer trained to think in systems, I can tell you this ban won’t do what people hope. And the more you study the data, the clearer the pattern becomes: we’re trying to solve a complex, psychological, generational shift with a single on/off switch.
Every major study on technology and kids points to one truth: the relationship is complicated.
Earlier this year, the massive ABCD Study (Nagata et al., 2025) found correlations between higher social media use and lower reading and memory scores. Cue the global panic.
And yet most public conversations still frame social media as a universal threat to kids’ mental health, instead of looking at the complex conditions that shape how teens actually use these tools.
But the researchers themselves stressed the key word: correlation. Kids who already struggle, they noted, may simply be turning to screens more.
Six years ago, Oxford’s Orben & Przybylski (2019) analyzed data from 350,000 adolescents in one of the largest studies ever conducted on kids and technology.
And they found the effect of digital technology on mental health was “minuscule… smaller than eating potatoes.” Follow-up studies have shown that moderate screen time may even correlate with better psychosocial functioning (Przybylski et al., 2020).
So if the issue is complicated, we can’t pin it on the screens, the social media platforms, or the age of the kids; each has been studied, and none has been conclusively shown to be the cause.
It’s how kids use screens. It’s why kids use screens. It’s what’s happening at home, at school, and inside their peer groups.
A ban can’t solve that because a ban doesn’t understand that.
Strong policies may try to protect kids, but they often lean on tools like age restrictions or age-verification rules that assume the problem is the platforms themselves. The truth is more human than that.
Underage drinking didn’t disappear when we started checking IDs; it simply changed venues. Kids moved from convenience stores to basements, backyards, and older friends’ houses. The behavior didn’t stop; it just went underground.
Today’s attempts to require age verification work the same way: they shift behavior instead of reducing risk.
And the internet isn’t any different. Block one doorway, and teens don’t shrug and give up; they find a new route around it. They’ll use VPNs, burner social media accounts, private browsers, vault apps disguised as calculators, overseas platforms with zero safeguards… the list of workarounds is endless, and teens stay a step ahead of whatever barricade adults erect.
So a ban doesn’t reduce harm. It redistributes it — pushing kids into darker digital alleyways where adults have less visibility, less influence, and far fewer opportunities to intervene before real harm happens.
It’s the same pattern we see when lawmakers try to control social media platforms through sweeping rules that force companies to enforce new limits, or pass bills that dictate how those companies operate. These moves often spark unintended consequences, and the risks migrate instead of disappearing.
Lawmakers can draft an online safety act or require platforms to add new restrictions, but if kids want to be somewhere online, they will be. And the more rigid the rules, the more they incentivize kids to step outside the spaces where any guidance or support is even possible.

New bills that require social media companies to tighten age-verification rules offer a band-aid solution — a clean headline that ignores how underage users actually move online.

That’s how we end up with rules that create friction for families while doing little to meaningfully improve safety, and that in some cases collide with First Amendment protections.
I get the impulse. I’m a dad. I’ve watched my own kids feel the gravitational pull of constant stimulation.
But here’s the truth most policymakers ignore: screens aren’t rewiring kids’ brains — they’re reshaping attention in the same way every new medium throughout history has.

We’ve been here before. New technology appears, adults panic, and headlines claim the sky is falling — a pattern now repeating with social media and age restrictions.

Every generation believes this time is different, and this time, kids really are doomed. But so far, every single generation has been wrong.
The new wave of state laws and online safety proposals built on fear makes the same mistake.
They assume that tightening age restrictions or raising the minimum age will solve the problem, and that any new rule that requires social media platforms to enforce stricter policies will somehow change how kids behave.
But no regulation aimed at social media companies has ever replaced what actually keeps kids safe — connection, conversation, and adults who stay close enough to see what’s happening.
Kids don’t follow rules because we wrote them down. They just find a different way in.
So when lawmakers lean on age verification, insist that platforms check users’ ages, or mandate parental consent, the policy reads tough but lands soft. Fear also pushes adults toward limits on free speech or louder calls to protect minors, but none of that replaces the only thing that’s ever worked: staying connected to kids while they grow up in a world that won’t stop changing.
And even when we see a bill passed that tries to block users under 16 from certain spaces, or when the rules drift into First Amendment territory, the results are the same: more friction, less connection, and no meaningful change in how kids actually navigate their lives online.
What kids do on social platforms activates the same core storytelling circuits their brains evolved for learning. For a lot of teens, social media isn’t escapism. It’s identity formation. It’s creativity. It’s communication. It’s how they practice being human.
And for LGBTQ+ youth especially, online connection is sometimes the only safe place to breathe.

For many LGBTQ+ young people, online communities offer the connection and safety they can’t always find offline — a reminder that not all social media spaces are harmful, and some are essential for real online safety.

Banning the spaces where teens explore themselves doesn’t actually protect minors. It cuts them off from the places where identity and connection take shape.
At Cyber Dive, we built the Aqua One smartphone specifically to protect kids at the point of risk. I’ve seen what actually happens—like a 14-year-old who connected with someone on Roblox claiming to be another teen. When the chat escalated to video, Aqua One’s nudity prevention locked the device instantly, alerted the parent, and provided evidence for police.
A ban wouldn’t have saved that child. Because if it wasn’t Roblox, it would have been another app, another loophole, another corner of the internet that sits outside whatever rules policymakers try to put on online services. Predators don’t care about bans. Algorithms don’t care about state laws. And the young people we worry about most will always find a way around the walls adults put up.
The only effective defense — the only thing that has ever worked — is staying close enough to see what kids are actually doing and guiding them through the parts that scare us.
We keep trying to write laws that outsource parenting to firewalls. But every time we do that, we leave teens alone in the very spaces where they most need us.
The answer doesn’t come from stricter age restrictions or another attempt to protect our kids through paperwork. And it doesn’t come from a new rule requiring social media companies to throw another technical barrier in front of them.
Instead, study after study shows the real protective factor isn’t bans, filters, or time limits; it’s warm parental involvement, open conversation, and gradual autonomy. Kids who know what healthy digital behavior looks like use platforms differently. They create more, compare less, and seek validation elsewhere. And none of that comes from policy.
You can’t legislate that. You have to live it with them.
There’s no silver bullet here… not for parents, not for lawmakers, not for tech companies. But there are approaches backed by research, child-development science, and real-world experience with families.
Kids don’t get safer when parents are blind. They get safer when parents can see what’s happening (messages, searches, photos) so they can step in early. Visibility gives parents context, not control, and it works far better than sweeping rules that force platforms to enforce one-size-fits-all safeguards.
Kids don’t need a security guard; they need a guide. They need adults who talk with them about what they’re seeing, who’s contacting them, and what healthy online behavior looks like. They need adults who accept that there will be things kids do online that freak us out, just like we did things that freaked our parents out. Someone who understands that even the strongest parental controls can’t replace open conversation.
When policymakers push solutions like age assurance, age verification, or rules that would require online platforms to screen every new account holder, they’re assuming the problem is technical. It isn’t. Kids don’t get safer because a platform flagged a user’s age or enforced a new parental consent provision. They get safer because an adult is present enough to help them make sense of what they’re experiencing.
Kids copy what we do, not what we say. If we scroll through dinner, they learn scrolling through dinner. If we set boundaries for ourselves, those boundaries become normal — not punishment — and far more effective than relying on new laws, tighter age restrictions, or abstract attempts to protect our kids from behind a screen.
None of these approaches are perfect. But they’re grounded in how kids actually behave, not how adults wish they behaved. And they avoid the trap of assuming another online safety amendment, tougher civil penalties, or stricter efforts to obtain parental consent can solve what is ultimately a relational problem, not a technical one.
Banning social media for everyone under sixteen may sound bold, but it won’t fix the underlying issue. It assumes that all kids are the same, that all online spaces carry the same level of risk, and that parents have no role beyond approving apps or signing off on parental consent forms.
It takes a deeply nuanced problem (development, identity, attention, relationships) and tries to solve it with a single on/off switch. Policies built on broad age restrictions, demands to verify users’ ages, or requirements that online platforms check every account holder’s details are attempting to do something legislation has never been good at: replacing judgment, presence, and connection with process.
When new rules lean on age-assurance technologies, tighter restrictions meant to protect children, or efforts to limit certain features for young Americans because of potentially harmful content, they often create gaps no one intended. Overly rigid settings can even block moments of genuine connection or interfere with everyday lawful, protected expression, without actually making kids safer.
And when platforms are told to implement age verification, enforce parental consent requirements, or fold themselves into whatever social media act or kids’ online safety act happens to be moving through the system, families end up drowning in process. We get forms to fill out, boxes to check, friction at every turn, but not the thing that matters most: real connection between kids and the adults who guide them.
The uncomfortable truth is this: safety doesn’t come from taking technology away. It comes from staying connected while kids learn to use it in ways that scare and surprise us. The harder we try to wall them off from the digital world, the more we guarantee they’ll learn to navigate it in secret — alone, unprepared, and without us beside them.

Derek Jackson
I’m always chasing the next challenge—whether it’s deep in the woods, in the pages of a new book, or at the forefront of innovation. As a dad of three and an Army veteran, I’ve built a life around problem-solving, adaptability, and thinking ahead. Before co-founding Cyber Dive, I led a team of intelligence soldiers in analyzing and targeting ISIS and other radical insurgents who used social media to spread propaganda and recruit foreign fighters. Now, I’m bringing that same expertise to parents, cutting through the noise to give them the information they need—whether they’re ready for it or not.
Type 3 Achiever / INTP Logician



