Blog: Doing what works – Finding the right evaluation method for your work can be a revelation

Andrew Findlay talks about how concentrating on approaches that fitted his organisation’s way of working and its service user group has helped Interest Link Borders produce a torrent of rich evidence…

Interest Link Borders is a volunteer befriending organisation, creating and supporting 1:1 and group friendships for children, young people and adults with learning disabilities in the Scottish Borders. We have four branches, which are centrally co-ordinated and together support about 200 people each year.

With funding from Paul Hamlyn Foundation, we worked with ESS in 2013 on a new evaluation strategy. Until then we had mainly relied on a three-yearly arm’s-length independent evaluation, which used feedback quotes and hard figures on outcomes but was of necessity fairly dry and not at all visual. We were also finding that traditional baseline and follow-up forms were not providing consistent evidence and did not inspire much enthusiasm among participants or staff. This was partly because our service users have moderate to severe learning disabilities and we work with them over a very long period: the baseline information often bore little relation to the follow-ups (or the follow-ups to each other), so using it to track progress was ineffective and became less relevant the longer we provided the service.

ESS facilitated workshops for staff and provided ongoing support to me as Project Co-ordinator. ESS’s advice was to become better at self-evaluation and to concentrate on approaches that fitted our way of working and our service user group.

The first fruit of this was an Impact Report in 2013-14, based on an organisation-wide survey asking service users, carers and volunteers to rate and describe the difference our service had made to them. Presented as quotes, word clouds and charts, and combined with case studies and lots of photos, the findings became a highly visual and accessible snapshot of the service.

The next step was to let our four local branches experiment with whatever evaluation methods they thought would work best. The result was a revelation: staff now had ownership of evaluation and produced a torrent of rich and diverse material showing off different aspects of their work, including videos, simple photo reports, detailed magazine-style reports, case studies, scrapbooks, evaluation workshops and animations. Participants enjoy the process and like seeing the results, and funders supporting very specific parts of our work are delighted to get material that is closely relevant and engaging. Taken together, these soon started to form a really broad evidence base, all of which could be displayed on our website.

After three years, local evidencing and evaluation are now a natural part of branch activities. Recent material is being drawn together for the three-yearly organisation-wide Impact Report, combined with hard outcome figures from another central survey that are comparable to those in the 2010 and 2013 reports. This hopefully strikes a balance between formally assessing the quality and effectiveness of our service over time and giving a visually attractive demonstration of the range and variety of our work.