Misaligned objectives or market functioning as intended?

By Simon Halstead, Founder of Halstead Incubation Partners and a regular NDA columnist

A lot of attention was rightly given to the ANA study before Christmas: the value lost to IVT and made-for-advertising (MFA) sites, the number of resellers and loops that exist, and a long tail of more than 80,000 domains.

This year will see continued progress on supply path optimisation and a reduction in partners, driving simpler, cleaner paths between demand and supply. But it also prompts the question: do misaligned objectives hold the industry back from our lofty goals?

As an industry, we (almost) always want to do the right thing, but do the goals of our businesses (and the goals of our clients) support this? 

The media world we live in is multifaceted and complicated, and as such it's sometimes hard to align our aspirations with the goals and requirements of each campaign, or of every publisher, in terms of effective delivery. We also face the challenge of information asymmetry, and a tendency to focus on the immediate path rather than the broader picture. An individual campaign not bidding on an opportunity doesn't mean the DSP doesn't participate, or that the fixed and variable infrastructure costs of the auction don't occur.

If we strip the industry back to its core, SSPs and exchanges sit at the intersection of two competing clients: demand-side needs and publisher partners. That means balancing giving buyers the right access to supply against filling the maximum volume at the highest price for a publisher. The publisher should remain the ultimate customer for an SSP/exchange, which could – and should – help the buyer get the most out of that publisher through discovery.

Since 2017, we have seen rapid expansion via header bidding, TAM and Open Bidding. This has led to multiple bid requests for the same inventory set and exponential growth in volume, which in turn has driven strong growth in bid shaping and SPO, impression filtering, and significant QPS capping. We may have made the pipes so large and repetitive, then applied filtering to find de-duplicated quality, that everyone is focused on the same inventory. This also leads to SPO deals that try to guarantee demand, potentially distorting the market further. Indeed, Ari Paparo talked about this on his latest Marketecture podcast: both the concentration of bidding on a small subset of the audience, and the need for a cookie to be present in order to bid.

Let us not forget that many of the measures of success for a DSP are purchasing inventory at the lowest cost, or delivering the CPA – focused solely on its own and the advertiser's interests, not on the balance of value for a publisher.

As I started prepping this article, I asked a group of friends what I should write about, and got familiar threads: the impact of AI, what ad tech will look like in 2024 and 2025, the death of the cookie, SSP commoditisation and increasing publisher-direct connections. But all of these share a single underlying thread: are the goals achievable? Or are they facing competing and misaligned objectives? Think back to the original goal of programmatic: enabling buyers and sellers to connect directly, with scale and efficiency.

Are we achieving the original goals we set out for programmatic in terms of delivering efficiently? Or have we become slaves to our own technology, making everything more complex in the process? Are we adding solutions like traffic shaping to paper over inefficiency, rather than addressing the root causes of the problem?

How do we overlay the challenge of publishers driving the most effective yield possible while managing multiple partnerships and SSPs, and at the same time having the most direct relationship possible with buyers? The ANA study talks about reducing the number of partners worked with by a significant factor – from 44,000 domains to 750 – while still achieving the same level of scale. It recommends that marketers reduce the number of partners and have direct relationships with all actors – DSP, SSP and publishers – potentially scaling down to only five to ten SSP partners.

In some ways, this appears logical and obvious – and a sign that we have lost sight of how to build reach and frequency on campaigns, versus an "everything, everywhere" approach. Perhaps the move away from individual measurement and targeting can even allow us to reassess our starting point: addressing cohorts and audiences.

Many DSPs are focused on delivering the most effective price point, but not necessarily the most effective path to a piece of inventory.

More importantly, as we see multiple hops and multiple loops, we need to understand what role each partner plays in the chain, and where that value is incremental or additive.

If the market is functioning, then revenue should decide whether reducing the number of partners is an effective strategy. The current experience of publishers is that adding further supply platforms delivers an increase in overall revenue. So if we are serious about our drive to carbon neutrality, we have to think about how we enable publishers to achieve the same revenue, or more, while working with fewer partners.

If demand will bid on an impression, does it not have a value – and what is the trade-off for consumer experience? On the Marketecture podcast, Dotdash shared how its focus on increasing page speed and UX saw yield rise. Is the market functioning as it should?

As we rethink how we target and move to a more cohort-driven world, perhaps there is an opportunity to align the focus on performance with supporting publishers as businesses, and with minimising unnecessary repetition to meet our ESG ambitions. Inventory curation and closer partnership is a key theme for 2024.
