A recent comms2point0 Twitter Poll showed that many teams are not evaluating and reporting on their work regularly. So what does good look like? This new case study from central government gives a useful insight.
by Damien Currie
A few years ago, one of the first things I used to do in January was put together a performance dashboard. It was supposed to show what we were actually achieving with our communication activities.
But it didn’t. The graphs and numbers looked impressive but they didn’t actually capture anything meaningful. No outcomes, just volumes.
Today, I can tell our board that the businesses we want to influence are using our content to educate and train their staff to comply with road safety rules.
We also know they’ve made decisions to invest financially in safer practices, such as buying new or better vehicle testing equipment, because of our campaigns.
And our activities respond to the issues businesses are facing right now, not just subjects that are strategically important to us.
So how did we get here?
Our evaluation journey began well before those performance dashboards came to life. But they are a part of the journey. Without them, we might not be where we are now.
Working in government communications, I’m lucky to be part of the Government Communication Service (GCS).
Over the years, I’ve attended many events and comms exchanges with other government communicators. The head of our profession, Alex Aiken, regularly speaks at these events all over the country.
Impressive as Alex is as a public speaker, the most important thing I take away each time is his ability to convey the value of our profession and the investment we need to make in it as professionals.
It was his call for all government communicators to be leaders that pushed me to challenge our board to make the necessary investment in communications: to transition away from the “spray and pray” method to targeted, controlled communications which speak directly with our audience.
Over the years, I’ve engaged with and sought buy-in from corporate senior leaders to adopt the basic building blocks of government communications: mandatory evaluation, partnerships, digital engagement and low-to-no-cost campaigns.
But without a formal budget, it was difficult to achieve fully evaluated communications.
That’s why the dashboards came about. We tried to go beyond media coverage by looking at conversion rates and topic values.
These were still just outputs and outtakes though.
A strategic way forward
In 2016, when our new strategic objectives were launched, they included a specific target to maximise the effectiveness of regulatory activities through communications.
How we delivered against that target would be included in an annual report to the Secretary of State – a document that’s also widely read by industry and key stakeholders.
This presented us with an opportunity to secure the investment needed to communicate directly with our audience and understand the effectiveness of those activities.
Having already briefed our board on the GCS Evaluation Framework, Modern Communications Operating Model (MCOM) and successive Government Communication Plans, I outlined how we would transition from proactive media engagement with intermediaries to a digital news service which validated and improved the behaviour of our target audience.
And those performance dashboards were part of the pitch – we had to do better than looking at how popular a news story was or whether proactive coverage was higher.
In fact, some partner research found that awareness of our road safety activities among our target audience had fallen between 2014 and 2016, despite a near 200% increase in proactive media engagement.
Embedding evaluation in every decision
With the necessary investment in this new direction, we now frame all of our decision making using evaluation, insight and research.
We use a digital communications platform with built-in evaluation.
Our content is driven by:
1. what our audience tell us they want
2. how they think and feel about their businesses and towards compliance
3. the times they’re interacting with us
We harness our own data to identify trends and themes (e.g. call centre enquiries) for content development.
We leverage our relationships with government partners who share our strategic aims to secure their expertise and resource in key areas which we can’t deliver internally (e.g. survey design and launch).
But looking back on our last 12 months, one of the most critical lessons I’ve learnt is the value of engaging regularly with our user researcher.
This is live evaluation, at your fingertips.
Our user researcher speaks directly to our audience on a weekly basis. She understands their attitudes, behaviours, aspirations and challenges.
She listens to our audience. There’s an established level of trust; they’re engaged with a person rather than a set of survey questions. They’re open about their experiences; honest about the difficulties they’re facing.
I regularly talk to our user researcher about what we need to know and what we want to achieve. We work together to test the tools and tactics used to communicate with industry and improve road safety standards.
Her research helps to validate the success of our communication activities.
But it’s also given us some great insight, which we use to develop new content and campaigns.
Our next challenge is to communicate with those businesses who don’t want to hear from us. Who fail to comply with road safety rules. Who put the lives of other road users at risk.
To do this we’re using our evaluation tactics and tools to devise new propositions which motivate behaviour change. We’ll be testing the impact of these messages and working with our partners to measure the outcomes.
Damien Currie is communications manager at Traffic Commissioners for England, Scotland and Wales & Senior Traffic Commissioner for Great Britain
You can follow him on Twitter at @damien_currie
image via U.S. National Archives