
The digital ad industry must accept its responsibility to help protect children online

by Phil Cowdell, Chief Strategy Officer, Channel Factory

In 2021, 4,752 children died from gun-related injuries in the United States, a country where a gridlocked Congress fails to act year after year. But yesterday, Congress again summoned social media platform CEOs to Capitol Hill, where Senator Lindsey Graham (R-S.C.) told them that, for their failure to protect kids online, “you have blood on your hands.” The bitter irony is enough to make any tech exec dismiss the hyperbole—but we shouldn’t.

The spectacle of these high-level hearings is off-putting. Politicians grandstand and tech execs obfuscate while we watch from a distance. But we’d be wrong to take the attitude that all of this happens above our pay grade. We in the ad tech industry are not only in a position to effect great change; we have a moral responsibility to do so.

The media ecosystem is just that—a world of interconnected players, each dependent on the next. Social platforms rely on creators to draw attention but take on so much content that not even they can police it all. Indeed, X CEO Linda Yaccarino said X suspended 12.3 million accounts in 2023, and TikTok CEO Shou Zi Chew plans to invest $2 billion in trust and safety measures for US users. But even these seemingly titanic efforts are not adequate.

The advertising and ad tech industries supply these platforms with life-giving revenue. As the ecosystem’s foundation, we have a responsibility to reengineer it in a way that protects children online. And as an industry, why can’t we? We are already experts at protecting brands from unsafe content. We deploy keyword blocking and exclusion lists to make sure ads appear nowhere near the most deplorable content. A laudable side effect is that this content is subsequently demonetized and often removed. But that’s not enough.
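To make the mechanics concrete, here is a minimal, purely illustrative sketch of the kind of naive keyword blocking described above. The blocklist terms and function names are hypothetical, not any vendor’s actual implementation.

```python
# Illustrative only: a naive keyword-based brand-safety filter of the kind
# widely deployed across ad tech. The blocklist terms are hypothetical.

BLOCKLIST = {"shooting", "suicide", "terror"}

def is_brand_safe(page_text: str) -> bool:
    """Refuse ad placement if any excluded keyword appears on the page."""
    words = {w.strip(".,:;!?\"'").lower() for w in page_text.split()}
    return BLOCKLIST.isdisjoint(words)

# The blunt instrument at work: both pages are demonetized, even though
# the second is responsible reporting rather than harmful content.
print(is_brand_safe("Graphic shooting footage, uncensored"))             # False
print(is_brand_safe("How schools responded to the shooting: a report"))  # False
```

As the second example shows, a bare keyword match cannot tell exploitation apart from coverage of it, which is exactly the limitation the rest of this piece argues we must move beyond.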

Mission: inclusion

We need to make protecting kids—and all vulnerable communities—central to our industry mission. And we can do it by amplifying positive content and constraining negative content. Here’s how.

Rather than relying on exclusionary tactics alone, we can apply inclusion strategies that fund content that makes the internet a safer and healthier place for children. When we work to align brands with positive content, we give diverse creators a platform to discuss important topics, like mental health, that can improve outcomes for our youth. And we allow brands to amplify their purpose and improve their reputation among the consumers they care about.

Certainly, we should continue to block the IAB’s and GARM’s objectively brand-unsafe categories. But we should also build more thoughtful keyword blocking and other exclusion methodologies, so that we stop demonetizing content that helps people of all ages find community around important issues like race and religion. And we can pay closer attention to the kinds of trending content that contribute to a toxic online environment and work together to demonetize it promptly.
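One way to read “more thoughtful” blocking, sketched below under the same hypothetical setup as the earlier example: weigh the context a flagged keyword appears in before demonetizing, and pair the blocklist with an inclusion list of topics worth funding. The category lists and decision labels here are illustrative stand-ins, not a real taxonomy.

```python
# Illustrative only: a context-aware refinement of the naive filter above.
# All category lists are hypothetical stand-ins for real taxonomies.

BLOCKLIST = {"shooting", "terror"}
SAFE_CONTEXT = {"report", "support", "prevention", "community", "health"}
INCLUSION_TOPICS = {"mental health", "community", "wellbeing"}

def tokenize(text: str) -> set[str]:
    return {w.strip(".,:;!?\"'").lower() for w in text.split()}

def placement_decision(page_text: str) -> str:
    words = tokenize(page_text)
    lowered = page_text.lower()
    if not BLOCKLIST.isdisjoint(words):
        # A flagged keyword alone no longer demonetizes the page:
        # responsible coverage keeps its funding when safe context is present.
        return "monetize" if words & SAFE_CONTEXT else "block"
    if any(topic in lowered for topic in INCLUSION_TOPICS):
        return "amplify"  # actively direct spend toward positive content
    return "monetize"

print(placement_decision("Graphic shooting footage, uncensored"))             # block
print(placement_decision("How schools responded to the shooting: a report"))  # monetize
print(placement_decision("Creators discuss mental health and community"))     # amplify
```

The design point is the third outcome: an inclusion strategy does not merely decline to block good content, it actively routes spend toward it.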

In the weeks and months ahead, governments and platforms around the world will continue to address the issue of child safety online. If we in the advertising industry simply sit back and read the coverage, shaking our heads but doing nothing, we preserve the status quo that allows kids to get hurt. Creating change is very much within our grasp. We don’t need Zuck’s direct line to do it.