
Q&A: Simon Wistow of Fastly talks ad infrastructure at the edge

Publishers are at a critical juncture: global ad expenditure is expected to surpass $800bn by the end of 2025, while mounting privacy enforcement – from the ICO and Ofcom to browser-level bans on third-party tracking – is pushing traditional client-side ad delivery to breaking point.

Trust in ad measurement is falling, signal quality is collapsing and the open web is under increasing pressure to demonstrate compliance, speed and monetisation.

Simon Wistow, Co-Founder and VP of Strategic Initiatives at edge cloud platform Fastly, believes that we are entering a new phase of advertising infrastructure, one where publishers must bring consent handling, identity and auctions back into their own server environments to thrive.

New Digital Age spoke to Wistow to find out more…

Tell me about Fastly. What led you to co-found it?

I’ve had an eclectic career, including early internet advertising in the ’90s, mobile games, search engineering at Yahoo, even visual effects on the Harry Potter films. But the common thread has always been taking large, complicated systems and making them fast and efficient.

Fastly was born out of sheer frustration in 2011. Back then, Content Delivery Networks (CDNs) were black boxes.* You put content in and hoped it worked. I once spent a 17-hour day trying to fix a one-character error in a config file. We realised we needed a CDN that could deal with rapidly changing, dynamic, personalised content, update instantly, and give you real-time visibility. That’s when the light bulb moment hit: what if you could actually run code at the edge?
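To make that light-bulb moment concrete, here is a minimal sketch of what “code at the edge” can look like, written against the conventions of Fastly’s JavaScript Compute SDK (@fastly/js-compute); the backend name "origin" and the /beta/ routing rule are purely illustrative assumptions, not anything from the interview:

```typescript
/// <reference types="@fastly/js-compute" />
// Minimal sketch: inspect each request at the edge and either answer it
// there or proxy it to a named origin backend ("origin" is illustrative).

addEventListener("fetch", (event) => event.respondWith(handleRequest(event)));

async function handleRequest(event: FetchEvent): Promise<Response> {
  const req = event.request;

  // Decisions that once lived in an opaque CDN config file become
  // ordinary code: a per-path rule that can be changed and redeployed
  // in seconds rather than debugged over a 17-hour day.
  if (new URL(req.url).pathname.startsWith("/beta/")) {
    return new Response("Beta is invite-only", { status: 403 });
  }

  // Everything else is proxied to the origin, still under edge control.
  return fetch(req, { backend: "origin" });
}
```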

Why do you believe we’re entering a new phase of advertising infrastructure?

Customers are deeply worried about the changes in cookie behaviour and how they’ll keep making money from advertising. It reminds me of the search engine wars at Yahoo: the tension between wanting traffic from Google and fearing you were giving them too much power. Now it’s the same with AI crawlers: publishers want them, but also worry users won’t come back to the primary source.

Even big media companies don’t have the time to engage with standards bodies or negotiate with the big AI crawlers. So we’re stepping in as the middleman by implementing emerging standards like Really Simple Licensing from the IAB or IETF, and making sure our customers have protections without needing huge budgets or specialist teams.
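As an illustration of the kind of protection involved, here is a hypothetical sketch of an edge service that matches known AI crawler user-agents against a publisher policy; the bot names, policy values and licence URL are assumptions for illustration, not taken from Really Simple Licensing or any other published standard:

```typescript
/// <reference types="@fastly/js-compute" />
// Hypothetical crawler policy enforced at the edge. Entries and actions
// are illustrative; a real deployment would derive them from the
// publisher's licensing terms.
const AI_CRAWLER_POLICY: Record<string, "allow" | "license" | "block"> = {
  GPTBot: "license",
  CCBot: "block",
};

addEventListener("fetch", (event) => event.respondWith(handle(event.request)));

async function handle(req: Request): Promise<Response> {
  const ua = req.headers.get("User-Agent") ?? "";
  const rule = Object.entries(AI_CRAWLER_POLICY).find(([bot]) =>
    ua.includes(bot)
  )?.[1];

  if (rule === "block") {
    return new Response("Crawling not permitted", { status: 403 });
  }
  if (rule === "license") {
    // Point the crawler at licensing terms instead of serving content.
    return new Response("Licence required; see linked terms", {
      status: 402,
      headers: { Link: '<https://example.com/ai-licence>; rel="license"' },
    });
  }
  // Ordinary visitors and allowed crawlers pass through to the origin.
  return fetch(req, { backend: "origin" });
}
```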

Nobody loves pages overloaded with JavaScript ads. It just slows everything down. We’re working with agencies and ad tech partners to make ads more efficient so publishers can actually do more with less.
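To show the idea, here is an illustrative sketch of assembling an ad server-side at the edge rather than via blocking client-side JavaScript tags; the backend names, the ad URL and the <!--AD_SLOT--> placeholder are assumptions, not Fastly’s actual ad integration:

```typescript
/// <reference types="@fastly/js-compute" />
// Illustrative server-side ad assembly: the edge fetches the page and
// the ad markup in parallel and stitches them together, so the browser
// never has to execute a heavy third-party ad script.

addEventListener("fetch", (event) => event.respondWith(handle(event.request)));

async function handle(req: Request): Promise<Response> {
  const [pageRes, adRes] = await Promise.all([
    fetch(req, { backend: "origin" }),
    fetch("https://ads.example.com/slot/leaderboard", { backend: "ad_server" }),
  ]);

  const page = await pageRes.text();
  const ad = adRes.ok ? await adRes.text() : ""; // fail open: no ad, fast page

  // One server-side substitution replaces a render-blocking script tag.
  return new Response(page.replace("<!--AD_SLOT-->", ad), {
    headers: { "Content-Type": "text/html; charset=utf-8" },
  });
}
```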

Right now, everybody’s worried: advertisers, publishers, the crawlers themselves. But what’s encouraging is they’re all turning up to standards meetings. It’s collaborative. People are experimenting with licensing deals, attribution requirements, or special endpoints for crawlers. It reminds me of the early 2000s when we got things like sitemap.xml – it feels like we’re heading toward that kind of standardisation again.

Are there any other market or tech trends bubbling up that you think will become more important?

Ten years ago, the expectation was you’d pick one hyperscaler, be it AWS, Azure or Google Cloud, and build everything there. Now we’re seeing multi-region, multi-cloud, stitched together with third-party services, and increasingly, on-prem AI workloads for privacy or GPU access. Customers want observability across all of it, and stitching it together efficiently is where we come in.

Meanwhile, streaming keeps growing. Instead of one family TV, it’s four people streaming different things on phones, laptops and smart TVs. Bandwidth demand is rising faster than supply, and during big events like the Olympics it’s a real problem. The internet feels infinite until you run a large network like ours; then you quickly learn its physical limits.

Finally, it’s worth noting that the scale of DDoS attacks an average site faces today would have made Google blanch 15 years ago. You can’t just buy a hardware box anymore – that model feels old-fashioned. Security has moved to the edge, close to where the attacks originate.
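As a rough illustration of edge-side mitigation, here is a toy token-bucket rate limiter of the kind an edge node might apply per client before traffic ever reaches the origin; the in-memory Map and the chosen limits are simplifications I’ve assumed for the sketch, since a real platform would share counters across nodes:

```typescript
// Toy per-client token bucket. Each request spends one token; tokens
// refill at a steady rate up to a burst cap. When the bucket is empty,
// the edge drops the request so the attack never consumes origin
// bandwidth. The Map is illustrative single-node state only.

interface Bucket { tokens: number; last: number; }

const RATE = 10;   // tokens refilled per second (assumed)
const BURST = 20;  // bucket capacity (assumed)
const buckets = new Map<string, Bucket>();

function allow(clientIp: string, now = Date.now()): boolean {
  const b = buckets.get(clientIp) ?? { tokens: BURST, last: now };
  // Refill in proportion to elapsed time, capped at capacity.
  b.tokens = Math.min(BURST, b.tokens + ((now - b.last) / 1000) * RATE);
  b.last = now;
  const ok = b.tokens >= 1;
  if (ok) b.tokens -= 1;
  buckets.set(clientIp, b);
  return ok;
}

// Usage at the edge: if (!allow(clientIp)) respond 429 and stop there.
```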

What role will AI most likely play in the near future?

AI is everywhere and at every stage of the Gartner Hype Cycle at once – the peak of inflated expectations, the trough of disillusionment, and the plateau of productivity. We’ve used AI internally for years for things like anomaly detection, routing and image optimisation. The real question is whether the current hype is a bubble that bursts, deflates, or becomes the new normal.

Of course, the potential environmental cost of AI is massive. Honestly, the only way to save energy is to do less. That’s why we built our semantic cache, an AI accelerator that recognises when two different queries mean the same thing, so you don’t need to re-run expensive models every time. It’s milliseconds instead of seconds, and it saves huge amounts of energy.
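Here is a minimal sketch of how a semantic cache can work, assuming an embedding-plus-similarity-threshold design; the character-frequency embedding is a toy stand-in for a real embedding model, and the 0.95 threshold is an illustrative assumption, not Fastly’s actual implementation:

```typescript
// Semantic cache sketch: queries are embedded as unit vectors, and a
// new query reuses a cached answer when its cosine similarity to a
// previous query clears a threshold.

type Entry = { vec: number[]; answer: string };
const cache: Entry[] = [];
const THRESHOLD = 0.95; // illustrative; tuning trades reuse vs. accuracy

function embed(text: string): number[] {
  // Toy embedding: normalised letter frequencies. A production system
  // would call a real embedding model here.
  const v = new Array(26).fill(0);
  for (const ch of text.toLowerCase()) {
    const i = ch.charCodeAt(0) - 97;
    if (i >= 0 && i < 26) v[i] += 1;
  }
  const norm = Math.hypot(...v) || 1;
  return v.map((x) => x / norm);
}

function cosine(a: number[], b: number[]): number {
  // Vectors are unit-length, so the dot product is the cosine.
  return a.reduce((s, x, i) => s + x * b[i], 0);
}

async function answer(query: string, model: (q: string) => Promise<string>) {
  const vec = embed(query);
  const hit = cache.find((e) => cosine(e.vec, vec) >= THRESHOLD);
  if (hit) return hit.answer;       // milliseconds, no model run
  const fresh = await model(query); // seconds, expensive
  cache.push({ vec, answer: fresh });
  return fresh;
}
```

The energy saving comes from the cache hit path skipping the model call entirely: near-duplicate questions are answered from memory rather than re-running inference.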

There’s a temptation to throw AI at everything, but you don’t need to. The future will be about mixing techniques like caching, smaller models, and reuse, just as search engines evolved with multiple specialised indexes. Efficiency is going to matter as much as accuracy.

* Editor’s note – A Content Delivery Network (CDN) is a distributed network of servers that caches website content closer to end users, reducing load times and energy consumption while improving performance.