Citizens Advice working together with their statutory funders to make monitoring and evaluation of advice more rational and more useful
HIP is a follow-up programme to Measuring Outcomes from Citizens Advice. The work was initiated by Parkhead CAB and funded by the Citizens Advice Scotland Development Committee. It took place between May 2016 and April 2017 and involved a learning set that met six times to explore how we could develop a more proportionate and useful monitoring and reporting system for bureaux (CABx).
The learning set involved representatives from CABx, Citizens Advice Scotland, Scottish Legal Aid Board, Scottish Government and The Improvement Service.
The Indicators for Measuring Citizens Advice pack suggests definitions and a small set of core indicators for measuring:
- Clients: the number and client profile
- Activities: the number of contacts, advice issues and type of support given
- Outcomes: 3 types of outcome (advice service, advice and client outcomes)
We hope that the pack:
- provides a more rational set of indicators that CABx can offer to funders
- encourages funders to adopt the same set of core indicators for reporting.
This doesn’t stop CABx collecting more information for other purposes (e.g. social policy analysis), or project funders asking for additional information or for short-term, in-depth pieces of evaluation. But we suggest that projects and funders be clear about why the additional information is needed, for how long, and how it will be used.
Benefits of taking this approach – less is more
- There will be less double recording (because the same indicators work for different funders)
- There will be more time to spend evaluating outcomes
- CABx stop collecting data that they and others don’t use or that is less useful
- Collecting less data means better quality data
- Using the same definitions and indicators means funders can better understand the overall reach and impact of their funding (and can compare like with like).
Next steps
This is a step along the path to meaningful measurement. Further steps include:
- Sharing the pack with more people, specifically CABx staff, other advice agencies and funders (local and national), and local authority staff
- Testing out the client outcomes and indicators with advice providers to check we have the right focus, indicators and method
- Discussing with a broad group of stakeholders how these indicators can be used in practice, both in terms of collecting information (timing and methods) and agreeing reporting requirements
- Setting up systems so that core indicators can be reviewed on a regular basis. They should not be set in stone for all time: to keep them relevant and useful we need to keep checking that data collection is adding value.
Background
Between June 2014 and March 2015 ESS ran a learning set of CAB practitioners aiming to:
- Explain: developing a collective model of the sector (in this case the advice sector) and showing links to national and local outcomes
- Measure: developing methods to collect information about outcomes
- Prove: bringing together research and practice based evidence to test models of provision (policy and practice)
As a result we developed the Measuring Outcomes from Citizens Advice pack.
This pack was shared with bureaux through regional days in September and launched in October at a conference for over 250 people, organised by The Improvement Service, Scottish Government, Scottish Legal Aid Board and CAS/CABx, and funded by the Scottish Government.
During this process we identified:
- Different funders require bureaux to measure clients, cases and advice given in different ways, leading to multiple recording of some data for different funders.
- Client profiling and activity reporting are very time-consuming, leaving less time for evaluating outcomes or using data to improve services.
- It would be helpful to identify what data is most essential to bureaux, funders and commissioners for accountability and to improve service provision and reach.
- There seem to be some discrepancies between advice agencies in how they define and count a case, and how they record the level of work and results.
- This may be making it difficult for funders and commissioners to understand the overall reach and impact of their funding and to compare like with like.
At the conference there was support across all the sectors for reviewing systems of measurement to ensure greater consistency across reporting requirements, to make better use of qualitative data and to put a greater emphasis on soft outcomes.
This work fits well with other initiatives to improve evaluation and performance monitoring, for example:
Debt advice evaluation framework: The Money Advice Service
https://www.moneyadviceservice.org.uk/en/tools/debt-advice-evaluation-toolkit-registration
The Scottish Legal Aid Board and the Improvement Service are considering the outcomes they want advice services to report on.
http://www.slab.org.uk/about-us/what-we-do/policyanddevelopmentoverview/Planningandcoordination/