On 21 and 22 February I attended the BRICK Conference 2018. The conference was titled ‘Learning for the Future: How to evaluate the impact of heritage projects’ and across the two days delegates and speakers explored the importance of evaluation in heritage, what it entails and how to do it well, through a range of panel discussions, presentations and breakout sessions. The full outputs from the day are available here:
Evaluation is not new to me personally, so I was keen to attend above all to meet people working on heritage projects, and those planning to undertake one. In that respect, I learned a huge amount from different viewpoints. For the benefit of those unable to attend, and hopefully even for those who did, I want to share five key insights from that learning.
Plan evaluation from the start
Most people tend to see evaluation as something that happens at the end of a project. This couldn’t be further from the truth. For an evaluation to be effective, planning has to happen at the beginning. As Ruth Gripper, from New Philanthropy Capital, put it in her talk: “Start by being clear about what you’re aiming to achieve.” The clearer your intended destination is at the start, the more likely you are to reach it. Projects will change, and what you planned will rarely match exactly what happens, but that doesn’t negate the crucial need to plan ahead by putting checks and balances in place.
Use a theory of change
Heritage projects are often run by people who may not have a background in evaluation, and resources are always tight. A simple way to start a project off on the right track is to develop a theory of change or logic model. This doesn’t cost anything and only requires some time investment by the key stakeholders involved in the project. A number of speakers presented theories of change which were good but very complicated, such as Ian Thomas’s logic model for Evaluating Cultural Heritage within an International Context. Don’t be put off by these complex-looking examples: as long as you can gain consensus on how your inputs and activities will result in your desired outputs, you are well on your way to understanding what your outcomes, and ultimately impact, could be.
Invest properly in evaluation
HLF’s new evaluation guidance (https://www.hlf.org.uk/evaluation-guidance) makes it clear we want future projects to invest even more than past projects have in their self-evaluations. We are supporting this by increasing the amounts we allow projects to allocate to evaluation in their project costs. If you don’t invest enough in your evaluation, it won’t be useful. Think about all the data you need to collect and the monitoring you need to put in place, and translate this into realistic resources to support it. The funding landscape is becoming more competitive, and many projects are securing multiple sources of funding, which adds further complexity. So make it easier to prove the impact of your project by investing realistically in demonstrating it.
Bring in different perspectives
When you mention the word evaluation, the majority of people wince! Evaluation causes this reaction because it essentially means assessing the value of something, such as a project. People working on heritage projects are emotionally invested in their work. I met many passionate people at the conference, which inspired me. We all want to think the work we do is valuable, and believe me, it is. The crucial thing is to assess its merit beyond our own estimation of it. Doing this effectively requires honesty, which entails a high degree of objectivity. Most projects fall short in this regard because they don’t bring in enough different perspectives to validate or challenge their own perceptions. Matthew Mckeague, CEO of the Architectural Heritage Fund, talked openly about what matters to his organisation. A key concern they have is evaluating their own performance. One way they are doing this is through their client impact survey, which assesses and rates the service they offer. This is a worthwhile pursuit: the more feedback, both positive and negative, we get from those impacted by our work, the better our future work will be.
Learning is continuous
Staying true to the conference theme, the opening talk by Dr Ulrike Hotopp from the UK Evaluation Society emphasised how evaluating can lead to great learning. Ulrike, a hugely experienced economist and expert in impact assessment, gave plenty of powerful examples from her career of times when she had used evaluation to facilitate learning. Only through learning can we hope to provide relevant solutions to problems.
Lessons tend to emerge at the end of an evaluation, in the final report, but often they are not meaningful because the learning that happens continuously throughout the project hasn’t been collected effectively. We end at the beginning, and that is what good evaluation is all about: a continuous and progressive body of learning for the benefit of heritage, people and communities.