While AI is helping reshape the craft of media planning, the second half of the New Digital Age and Nano Interactive roundtable turned to a more sobering conversation. (Read Part One here.) What does all this progress mean for ethics, regulation and the broader social contract between technology and humanity?
“AI isn’t just another planning tool,” said Ed Cox, Partner at Beyond. “It’s a systemic shift that touches everything — from who we hire to how we influence culture.”
A key worry among the panellists was the potential for a two-tier internet. “If 50 percent of content is AI-generated, where do brands go for quality?” asked James Shoreland, CEO of VCCP Media. “We risk an ecosystem where only those who can afford premium platforms get real visibility, while the rest navigate an ocean of synthetic noise.”
Emma Withington, Chief Planning and Strategy Officer at Havas, linked this to wider socioeconomic concerns. “We’re optimising ourselves into irrelevance,” she said. “If automation strips away people’s income, who are we marketing to?”
The conversation naturally turned to the role of regulation. Pius Hornstein, Global Head of Digital, Global Business Units, at Sanofi, explained how his team is establishing internal guardrails. “We’ve created independent compliance bodies to review AI projects,” he said. “It’s not just legal risk, it’s ethical integrity.”
Niall Moody, Chief Revenue Officer at Nano Interactive, agreed. “Just because something is possible doesn’t mean it’s responsible,” he said. “It’s on us to act ethically before regulation catches up.”
But others expressed scepticism that the regulatory environment could move quickly enough. “Governments don’t have the incentive or the speed,” said Ed Freed, Global Transformation Officer at Rapp. “And platforms won’t self-regulate unless there’s a business reason to do so.”
Erfan Djazmi, Chief Digital Officer at Mediahub Worldwide, pointed to the deeper structural tensions. “There’s a conflict between the commercial race to monetise AI and the human need to feel secure and valued,” he said. “We’re sprinting, but many don’t know what they’re running towards.”
This sprint has consequences, especially for talent. “We’re seeing a clear divide between those who are comfortable with AI and those who aren’t,” said Withington. “Some women feel it’s cheating to use these tools — that’s a cultural barrier we need to address.”
Deborah Gbadamosi of the IAA UK stressed the financial disconnect. “We still reward volume and busywork over strategic insight,” she said. “If AI is going to take the grunt work, we have to figure out how to value the brains behind the strategy.”
Shoreland added that media’s current model is already unsustainable. “The industry thrives on volume — more impressions, more cash,” he said. “But AI shifts that logic. Are we ready to reward quality instead of quantity?”
Moody raised an urgent point about content visibility. “Everything we create now is a signal for language models,” he said. “We need to train them with diversity and purpose, or we’ll just reinforce the same biases we’re trying to escape.”
The group agreed that ethical responsibility cannot be outsourced. “If you’re letting Meta optimise your entire media plan, you’re also trusting them with your values,” said Cox. “And you may not like where that ends up.”
Moody offered a more optimistic view. “AI is allowing us to intervene earlier, to shape strategy rather than just respond,” he said. “It’s about reframing the question, interrogating with human-informed insight, not just automating the answer.”
Wallace described a breakthrough moment. “We built a brand in under an hour at South by Southwest,” she said. “The tools are there — now we need to make sure we’re using them responsibly.”
Hornstein remained cautiously hopeful. “If we implement AI with intention, in five years we’ll be working smarter, not just faster,” he said.
As the discussion closed, Gbadamosi captured the broader tension perfectly. “We’re at a fork in the road,” she said. “We can automate, accelerate and alienate — or we can build something that’s more thoughtful, more inclusive and, crucially, more human.”