Evaluation can improve the impact of community programmes
Evaluation is often forgotten, or excused away, when looking at the performance of social programmes. The usual response is that impact is 'too difficult to measure'. But learning together, through evaluation, can improve the delivery of programme benefits to everyone involved.
Below, Dr Gina Yannitell Reinhardt shares her experience of embedding evaluation into social programmes. This involves engagement with a broad range of partners – from public service delivery organisations, through to policy makers, and finally to the public themselves. In this blog we take a different angle on public engagement – that it’s about learning together with active participants in the evaluation process.
A lot of public funding in the UK goes to social programmes – projects and initiatives designed to help improve people’s lives. You may have heard of some of these initiatives, like the Campaign to End Loneliness or Active Lives. These always sound like great programmes, designed to tackle important issues, particularly among the most vulnerable people in the country.
Are these programmes worth the money we spend? That is a tough question to answer, because the intended impacts of these programmes – reduced loneliness, increased resilience, improved wellbeing – are not easy to measure. But that doesn’t mean we shouldn’t try.
ARISE, Advancing Resilience and Innovation for a Sustainable Environment, is an initiative I founded to help build resilience among vulnerable people. ARISE is dedicated to the idea that evaluating the impact of public programmes is key to informing future programme and policy design. How does it do this? By evaluating the impact of public programmes, and by helping local authorities build the capacity to do the same.
How do we begin?
The first thing I always ask people is: What impact is this programme intended to have? In other words, imagine the programme is nearly complete, and you have to decide whether it should be continued or extended. What would convince you that this programme was a success?
When it comes to projects designed to have social impact, answering this question can be a challenge. What does ‘reducing loneliness’ look like? How would we measure loneliness to begin with?
Luckily, there are scientists who have spent decades researching how to measure elusive concepts like loneliness, happiness, trust, and wellbeing. So we don’t need to come up with new ways to measure something – we just need to find out how the experts do it. Then we can use their techniques for our own evaluations.
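To make that concrete with a purely illustrative example: one widely used instrument is the three-item UCLA Loneliness Scale, which asks three short questions, codes each answer from 1 ('hardly ever') to 3 ('often'), and sums them into a score from 3 to 9, where higher totals indicate greater loneliness. The minimal sketch below shows what that kind of scoring can look like in practice; the wording, coding, and example data are assumptions for illustration, not part of ARISE's or any partner's methodology.

```python
# Illustrative sketch only: scoring responses to the three-item UCLA Loneliness Scale
# (each item coded 1-3, total ranges from 3 to 9; higher means greater loneliness).
# Question wording, response coding, and example data are assumptions for illustration.

RESPONSE_CODES = {"hardly ever": 1, "some of the time": 2, "often": 3}

ITEMS = [
    "How often do you feel that you lack companionship?",
    "How often do you feel left out?",
    "How often do you feel isolated from others?",
]

def loneliness_score(answers):
    """Sum the coded responses to the three items into a single score (3-9)."""
    if len(answers) != len(ITEMS):
        raise ValueError("Expected one answer per item")
    return sum(RESPONSE_CODES[a.strip().lower()] for a in answers)

# Example: comparing average scores before and after a programme gives a simple,
# comparable indicator of change in loneliness.
before = loneliness_score(["often", "often", "some of the time"])       # 8
after = loneliness_score(["some of the time", "hardly ever", "often"])  # 6
print(before, after)
```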
Will the evaluation be worth it?
One of the biggest objections to evaluating public programme impact is that the evaluation itself takes up resources that could be used to deliver the project. This is true – any time or money spent on evaluation is time or money that could be spent elsewhere. But it’s important to realise that evaluation can help offset programme cost, and increase programme impact, both within and outside of the organisation itself.
How is that possible? Well, for one thing, evaluations help staff and frontline service personnel understand how they are being assessed. Staff get to see the impact of their work, and the reasons for their managers’ decisions.
Managers, meanwhile, can use evaluations to refine programme delivery and inform future decisions. Projects become more efficient as they are delivered, and have greater impact in the community. Frontline workers, in turn, deliver services with improved morale and wellbeing, and the services themselves are better.
Essex County Fire and Rescue: A Case of Prevention
ARISE works with local authorities throughout the UK. Essex County Fire and Rescue Services (ECFRS) first asked ARISE to help evaluate its Parish Safety Volunteer Pilot, and then to help assess a new home safety programme going forward.
"I met Gina at a public lecture she gave about the importance of evaluation in public service delivery, and was both delighted and impressed with her approach. Her expertise on the importance and value of evaluation on public service delivery is impressive, and how, if not done or done properly, policy makers make the wrong decisions with lasting social impacts. The ARISE team’s work has made it possible to consider how collaborative partnership work between police and fire nationally could look."
- Andrea MacAlister, Head of Community Safety at ECFRS
Working together, ARISE and ECFRS developed the Five C’s, which guide the evaluation philosophy at ECFRS:
Core: Make sure evaluation is embedded into the core of service delivery; do not save evaluation for the end of a project.
Consistent: Evaluation cannot happen only once a year. Evaluators, decision makers, managers, and frontline workers should regularly share and discuss evaluation results and be aware of when to expect evaluation updates.
Clear: Evaluation results should be presented clearly and simply. Allocate time to discuss and explain evaluation results.
Caring: Evaluation results should not create anxiety. Share evaluation results with anyone whose work is being evaluated before wider release.
Constructive: Use evaluation to help craft and support a positive story of development and improvement. Allocate time to reflect and refine based on evaluation results.
More and more these days, I meet people concerned with how public money is spent. What are taxes buying? What value are we getting? What is the 'right' way to spend public funds? If we don't evaluate programme impact, we will never know the answers to these questions. Luckily, evaluation isn't as bad as it sounds. Contact ARISE by emailing evaluation@essex.ac.uk if you'd like to know more.