
How YouTube is trying to fix its Kids app without ruining it

YouTube is committed to the free-form, user-generated nature of its platform. It’s tough to reconcile that with a version that’s perfectly safe for kids.

[Photo: Hal Gatewood/Unsplash]

By Jared Newman

Malik Ducard, YouTube’s global head of family and learning, likes to tell a story about the time the YouTube Kids app clicked for him personally.

It was just before the app’s official launch in 2015, and his youngest son was playing with an early version during a family brunch. Unlike the regular YouTube app, YouTube Kids has a simple, cartoonish design that young children can easily navigate, along with various filtering tools and timers for parents. After the app streamed a clip of Phineas and Ferb, YouTube’s recommendation algorithm served a video on how to draw the characters. The restaurant produced some crayons and a paper placemat, and an impromptu YouTube art lesson broke out.

“It was great, because it was him engaging the app, and viewing the app, but him also being active with the app,” Ducard says. “With YouTube Kids, we are as proud of the engagement that we see as we are the action–people putting down the app and going out and playing.”

As a parent of young children, I find it hard to reconcile that rosy vision with my own YouTube Kids experience, in which the app is easily overrun by toy unboxing videos and other assorted junk food. Meanwhile, I’m always worried that its recommendations will travel down a dangerous path toward inappropriate videos that somehow slipped past YouTube’s automated filters, as has been known to happen.

Ducard’s excitement and my wariness have one thing in common, though: They’re both rooted in YouTube’s use of algorithms to determine what the app recommends and what appears in search results. That approach to wrangling kid-friendly content differs starkly from the more human-intensive curation Amazon has adopted for its FreeTime Unlimited service.

Over the past four years, machine-driven recommendations have been the source of numerous YouTube Kids-related scandals, from violent videos in search results to videos depicting self-harm. That history might explain why the recent “Momo challenge” hoax, in which a ghastly creature supposedly popped into children’s YouTube videos to encourage suicide, seemed credible enough to go viral, even though there was nothing to it.

But for folks like Ducard, those algorithms are also YouTube’s greatest source of potential as an educational tool, allowing kids to learn about things they’d never encounter in traditional media. That’s why YouTube Kids isn’t abandoning them, and why giving kids access to the app may always feel a bit risky.


ABOUT THE AUTHOR

Jared Newman covers apps and technology from his remote Cincinnati outpost. He also writes two newsletters, Cord Cutter Weekly and Advisorator.

