The AMEC Summit, held for the first time in London last month, sets in motion an approach with wider implications than just evaluating the impact of communications.
by PANEL WRITER Neil Wholey
I often wonder why there is so much focus on the evaluation of communication. Of the professional sectors I work with, I can think of no other so obsessed with stressing how important evaluation is.
Evaluation is a very simple concept: prove causality. Did the thing you said would happen actually happen, and was it a result of what you did? I sense we talk so much about evaluation because, to those outside the profession, in particular those in charge of our organisations, lots of communications metrics are very visible. Leave it to others to evaluate based on what they see, and the focus falls on visible volume measures: how many leaflets you’ve produced, or the column inches you’ve gained in the newspapers they happen to read.
It doesn’t seem difficult to achieve high scores on these metrics, which rise in direct proportion to budget, and to assume a causal impact. But communications professionals know these kinds of metrics are weak evidence of causality, and the behaviours they encourage are hardly why they chose communications as a career. We know that evaluation can be used to improve the quality of our activities and deliver a more substantial impact for the organisation.
The annual high point of the communications evaluation debate is the AMEC Summit, which was held last month for the first time in London. Over 300 delegates from across the world met to discuss the Summit theme of “Making Metrics Matter – Taking Measurement Mainstream”.
AMEC is the International Association for the Measurement and Evaluation of Communication. When its Summit was in Barcelona it launched the Barcelona Principles, a set of seven principles which underpin the basic structure of good communications evaluation. In London it launched a new integrated evaluation framework which builds on these principles, providing a free interactive tool and supporting materials to help plan and evaluate communications activity.
The beauty for those of us in the UK public sector is that AMEC works closely with the Government Communication Service (GCS), and the GCS evaluation framework and guidance launched last year use exactly the same principles as AMEC’s.
We therefore have a free set of tools and ideas to help us evaluate communications in a structured and logical way. Those who implement the framework best will be those who use it to engage their internal audiences and change their behaviour, rather than just dropping a few metrics into a spreadsheet. It is a framework designed to let individuals be involved in evaluating their own activities and to improve their work by thinking the evaluation through.
There is also another reason I would encourage you to use this framework. Although it is designed as a communications evaluation tool, its foundation is a logic model: the chain that links inputs and activities to outputs, outcomes and impact. This is the basis of programme and policy evaluation and of business cases, including the guidance in the Magenta Book issued by HM Treasury in the UK.
In other words, through an international desire to cut through the noise of volume measures such as press releases and column inches, we have ended up with the core components of an HM Treasury business case. The launch of the new AMEC integrated evaluation framework is therefore not the end game in taking the measurement of communications mainstream. It opens the door to the evaluation of communications, and the tools and guidance developed by AMEC and GCS, becoming the basis of how we practically evaluate all the work of an organisation.
In itself the framework has taken a complex subject and made it understandable and practical for the audiences that matter. Communications evaluation has the potential to move from being a defensive reaction against the tyranny of measuring column inches to being at the heart of driving and evidencing change across whole organisations.
Neil Wholey is Head of Evaluation and Performance at Westminster City Council, Director of Insight at Westco and a member of the GCS Evaluation Council
image via Tyne & Wear Archives & Museums