Adam Chugg is Head of Big Tech Activations at the7stars and NDA’s new monthly columnist.
Whilst working at Meta, behavioural economist Julian Runge conducted research which found that only 12.6% of a sample of 6,777 businesses had used randomised controlled experiments. This is astounding considering the performance improvements that can be gained through such experimentation. The same research highlighted that businesses that run just 15 experiments a year see about 30% higher ad performance.
The size of the prize is large.
Having recently collaborated with Meta on a project exploring the optimal conditions for successful and meaningful experimentation, we looked at how experimentation can be used to effect organisational change: specifically, what the common barriers are, what the optimal process looks like, and how we make sure what we learn is actionable.
In doing so, we assessed what makes experiments and change fail just as much as what makes them succeed.
Key ingredients of successful experimentation
Almost all marketers should be familiar with the key ingredients of successful experimentation:
Ask: Craft hypotheses based on what you’re trying to learn and the outcome measures that will determine success.
Make: Design experiments based on your hypothesis and what you’re trying to learn.
Learn: Analyse results and insights from the experiment based on primary KPIs and secondary diagnostics.
Adapt: Strategically and creatively determine how the learnings will be implemented and scaled.
These ingredients are implemented through a process of discovery, setup, execution and scaling.
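By way of illustration, the Learn step for a simple randomised holdout test reduces to comparing conversion rates between exposed and control cells. Below is a minimal Python sketch using a standard two-proportion z-test; the function name, cell sizes and conversion rates are invented for illustration and are not drawn from the research.

```python
from statistics import NormalDist

def holdout_lift(conv_control, n_control, conv_exposed, n_exposed):
    """Two-proportion z-test: did the exposed cell outperform the control?"""
    p_c = conv_control / n_control
    p_e = conv_exposed / n_exposed
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conv_control + conv_exposed) / (n_control + n_exposed)
    se = (p_pool * (1 - p_pool) * (1 / n_control + 1 / n_exposed)) ** 0.5
    z = (p_e - p_c) / se
    p_value = 1 - NormalDist().cdf(z)  # one-sided: exposed > control
    return p_e - p_c, z, p_value

# Hypothetical cells: 10,000 users each, 5.0% vs 5.6% conversion
lift, z, p = holdout_lift(500, 10_000, 560, 10_000)
print(f"absolute lift {lift:.1%}, z = {z:.2f}, one-sided p = {p:.3f}")
```

With these made-up numbers the 0.6-point lift clears the conventional 5% significance threshold; the point is that the primary KPI, the comparison cells and the decision rule are all fixed at the Ask stage, before any data is collected.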
But why are experiments still so infrequently conducted, and why do they so often fail to lead to meaningful change when they are? The answer often lies in the social and organisational aspects of planning and potential impact.
Breaking Down Barriers and Embracing Change
Within today’s media landscape, an effective measurement journey can challenge existing organisational structures.
Automation has broken down barriers between brand and performance, and between online and offline outcomes. User journeys are not linear, and measurement journeys must be built with this, and with what is technically possible, in mind, rather than around the historic responsibilities and KPIs of siloed teams, as is often the case.
Most organisations and marketing teams have not yet caught up with the pace of change in the media landscape. It is still common to have offline media teams responsible for driving offline business goals (e.g., footfall) alongside ecommerce teams responsible for “digital” media tied to digital KPIs.
These structures make it insufficient to focus purely on the technical side of experimentation. Any project should establish a shared understanding of goals and roles amongst all stakeholders, which is harder when those stakeholders sit in different teams with their own agendas.
As an obvious example, take social video and TV. Traditionally, those responsible for managing these channels, their objectives and their budgets would sit in very different teams: social would be the domain of “digital” teams, while TV would sit within “media”. The former would often be used predominantly for performance or brand objectives in silo, with native measurement solutions (e.g., brand lift studies); TV would be used to deliver reach and long-term brand-building objectives.
Effective measurement journeys for today’s media landscape
Measurement solutions like Nielsen TAR offer the ability to measure the combined reach and impact of these channels, which could fundamentally change the approach to investment. But where is the incentive to explore this, or even to buy in to the potential results, if key stakeholders also need to protect their own interests (budgets and their siloed KPIs)?
A test-and-learn culture can be developed by making small refinements to current practices and focusing on the people who will enact change – this is a minimum requirement. But, in the end, breaking down outdated or arbitrary barriers can deliver greater, transformational business outcomes and allow the data to do the talking.
The full white paper for the research conducted in partnership between the7stars and Meta can be downloaded here.