According to CNBC, YouTube CEO Neal Mohan, who was just named Time’s 2025 CEO of the Year, admitted in an interview that he and his wife strictly limit their three children’s time on YouTube and other social media platforms. Mohan said they are “more strict” on weekdays and less so on weekends, following a philosophy of “everything in moderation.” His comments come as experts like NYU professor Jonathan Haidt continue to warn about the harms of excessive smartphone and social media use on kids, advocating for no smartphones before 14 and no social media before 16. This week, Australia became the first country to formally enact a law barring users under 16 from accessing major social platforms, a move backed by 77% of Australians in a 2024 YouGov survey. Mohan also stated he feels a “paramount responsibility” to young people and wants to give parents more control, pointing to the child-friendly YouTube Kids app launched in 2015.
The Executive Hypocrisy Question
Look, here’s the thing that always gets me. We keep seeing this pattern, right? A tech CEO builds a massively engaging, arguably addictive platform, profits enormously from it, and then turns around and says, “Oh, not for my kids.” It happened with Steve Jobs, who famously restricted his own children’s screen time; it’s a running theme in Silicon Valley, and now the head of YouTube is doing it. It feels… off. Doesn’t it? Mohan is talking about feeling a “paramount responsibility” while also admitting the product he oversees needs to be rationed in his own home. That’s a pretty stark confession about the inherent nature of the service. It’s not that he’s wrong to limit his kids—he’s almost certainly right. But it highlights a massive disconnect between a product designed for maximal engagement and its suitability for young, developing minds.
Australia’s Bold And Messy Move
So Australia just went and did it. They’re the first country to actually pass a law barring kids under 16 from holding accounts on major social media platforms. The public support was huge: that YouGov poll showing 77% backing is no joke. But now comes the hard part: enforcement. How do you actually verify someone’s age online? It’s a notoriously tough nut to crack. Government ID? Facial recognition? Either way, it’s a privacy minefield. And you can bet platforms will push back; the rollout is already facing resistance. This is a classic case of a policy that sounds great and simple in theory but gets incredibly messy in practice. Still, it sets a huge precedent, and other governments are watching.
The Moderation Trap
Mohan’s solution, both for his family and for YouTube’s public messaging, is “everything in moderation.” And on one level, that’s perfectly reasonable. It’s the classic parental approach. But in the context of platforms engineered to undermine moderation, is it a realistic ask? Jonathan Haidt’s argument, which you can hear in his various talks, is that these aren’t neutral tools. They’re super-stimuli. Asking a kid to practice “moderation” with TikTok or YouTube Shorts is like asking them to eat just one Pringle. The platform’s entire business model is built on defeating that intention. So while Mohan’s heart might be in the right place, and tools like YouTube Kids exist, the core tension remains unresolved. The CEO’s own house rules prove the product’s default state isn’t safe for unlimited consumption.
What’s A Parent To Do?
Basically, we’re left in a weird spot. The experts are sounding alarms, the CEOs are locking down their own homes, and governments are starting to swing the regulatory hammer. For the average parent, it’s confusing and exhausting. Mohan says he wants to make it easy for all parents to manage their kids’ use, and that’s a good goal. But it also feels like passing the buck: the burden of managing a billion-dollar attention-extraction machine falls on the individual household. The Australian law is a brute-force attempt to shift that burden back to the companies and the state. I don’t know which approach is right. But the fact that the conversation is moving from “parental guidance” to “outright bans” tells you how severe the problem has become. The people building these worlds don’t want their own kids living in them without strict borders. That should tell us everything.
