I recently published a learning report which documents SHARE’s experience using outcome mapping to plan, monitor, and evaluate research uptake. This blog shares a summary of our experience for academic organisations, research consortiums, M&E specialists – and anyone else aiming to engage policymakers and practitioners with their research!
Why outcome mapping?
Policy change takes place in a complex system over many years and involves multiple stakeholders, each using different influencing approaches. Pathways of change are difficult to predict in advance using linear representations of change such as logical frameworks. It is also challenging to attribute change to any single influence, or even to define what counts as impact.
Outcome mapping (OM) is a participatory approach for planning, monitoring, and evaluation developed by the International Development Research Centre (IDRC). It focuses on identifying, engaging, and influencing key stakeholders to change their behaviour. Importantly, outcome mapping recognises that we contribute to change in collaboration with others and in relation to wider contextual shifts. In SHARE, we were particularly influenced by ODI’s Rapid Outcome Mapping Approach (ROMA), which is designed to engage with policy.
We have five implementing partners – four research institutions and one NGO – all of whom were undertaking OM for the first time. Each partner was responsible for designing their own research-into-use plan using outcome mapping. Partners approached the design process differently, according to what was appropriate for their context (for example, some invited external stakeholders while others worked internally).
I provided support on OM as a method, as well as hosting annual reflection workshops with each implementing partner to review progress, identify challenges, and make changes where necessary.
We learnt a lot from using outcome mapping, particularly through the challenges we faced. For example, we found that developing OM documents was both time- and resource-intensive. It took longer than originally anticipated – between three and nine months per partner!
It was positive that SHARE had brought on a full-time M&E specialist (myself!) to support this process and help build a strong foundation for using OM across the programme. It was also important that this process was owned by implementing partners, who continually increased their confidence around the approach.
Another key challenge for me was managing large quantities of monitoring data – a process I improved iteratively over time based on partner feedback. It was particularly useful that we could quantify and track overall programme progress by linking OM data to a logframe indicator, while still analysing rich qualitative data for depth and context.
Outcome mapping was particularly relevant for us because of its flexibility – it gave implementing partners space to react to opportunities and to change their approach if new opportunities or challenges emerged. This flexibility also helped us to capture unintended change.
For example, our implementing partner in Kenya was asked to lead the newly established county policy and research group in Kisumu – an achievement that led to further opportunities for influencing at county level. Our partner in Zambia ended up co-hosting a UK Parliamentary visit, which contributed to stronger donor relationships and enabled them to leverage further research funding.
We could not have predicted these achievements in our initial plans, but we adapted our plans to seize these opportunities – and were able to monitor them effectively through the outcome mapping approach. Other benefits of OM included improved communication and collaboration with stakeholders and a growing learning culture across the programme.
You can learn more about our experience, including recommendations for others, by reading the full report!
This blog was first published by Research to Action.