Article · June 26th, 2020
Education • Legitimacy • Innovation • Delivery

Measurement for Learning: Methods & Tools that Enable Improvement

Article highlights


In the final #MeasurementforLearning blog, @JohnBurgoyne07 describes methods & tools that can enable learning & improvement


Learning pods, values & principles check-in & storytelling evaluation methodology are some great #MeasurementforLearning methods & tools


How can #MeasurementforLearning stick? This series from @JohnBurgoyne07, and our upcoming discussion paper w/ @nesta_uk, finds out



We are a working group of ten pioneering local authorities focused on meaningful measurement, as part of the Upstream Collaborative, an active learning network of local government innovators supported by Nesta. This blog series sets forth a different measurement approach for public services centred on learning, and will culminate in a discussion paper in the Autumn. This approach has been shaped by our practical insights and experiences working in local government.

We are from the following local authorities: Barking & Dagenham, Derbyshire, Gateshead, Huntingdonshire, Kirklees, Leeds, Greater Manchester, Oxford, Redcar & Cleveland, and York. John Burgoyne, from the Centre for Public Impact (CPI), serves as a listener and facilitator of the working group, and has gathered our collective input to shape this blog series and the forthcoming discussion paper.

In this final post of our blog series, we will describe some of the methods and tools that can enable measurement for learning. As a working group, we have explored measurement methods that broadly fall into two categories:

  • Internal: how we learn as a team, an organisation, or a local government more broadly

  • External: how we learn with others outside of local government (e.g., residents, community partners)

How we learn as a team

We would like to share two measurement methods that better enable learning and improvement within local governments. An important theme across these methods is that they are most effective when a team has a strong culture anchored around a set of values and principles that support openness, collaboration, and learning. If, for example, frontline staff do not feel trusted to share their authentic perspective, including what they feel is not working, then it will be difficult for the broader team to learn from their experiences.

When senior leaders in the field embrace these types of measurement methods, they signal that they aim to cultivate cultures that support learning and improvement. When staff feel that what they share will not be used for accountability (i.e., they will not be punished or rewarded for it), they are more likely to be open and constructive about their experiences. A baseline level of psychological safety enables staff to speak candidly about both the successes and failures they observe, leading to learning about how to do better.

Learning Pods

Learning pods are one of the internal measurement methods where this psychological safety is critical. This method, inspired by Chris Bolton's viral blog post on deploying learning and innovation teams in response to COVID-19, pairs staff who interact directly with residents to reflect on what they have experienced and learned over the past week.

Instead of predetermining what they will report on, learning pods use a set of open-ended questions so the staff participating can share what has emerged in a dynamic, adaptive way. Questions include:

  • What have you done differently this week?

  • What did you learn?

  • What enabled that learning?

  • What has gone wrong?

Knowing they will not be punished for what they say, staff feel safe opening up about what they think could have gone better, and those insights are used to inform decision making about how to adapt and improve moving forward. After reflecting on the questions with a learning partner, staff come together for a group discussion to understand perspectives across different pods. In addition to enabling learning, the pods build empathy as staff are exposed to a wide range of perspectives.

Becky Willis, Project Manager at Oxford City Council, who is trialling this approach, has reflected:

Meaningful measurement could be a big turning point for our council.

Values & Principles Check-In

Another way to enable learning and improvement is to explicitly check in on the values and principles underpinning your approach to measurement. Our working group has recommended a set of values - trust, authenticity, and curiosity - and principles that flow from them, but we encourage you to develop your own, which you can embody and hold yourselves accountable to.

This method draws inspiration from Confirmation Practices (see a video summarising what they are here), simple routines for systematic reflective practice developed by Andy Brogan at Easier Inc. Our adapted version, which you can find here, simply involves bringing team members together on a regular basis (e.g., every month) to reflect on the following questions for each principle you have defined:

  • How well are we adhering to this principle, from 1 (not very well) to 4 (very well)? Don't overthink your response; just go with your gut reaction.

  • Why do we deserve that rating?

  • What action can we take (if any)?

As with the learning pods, the purpose is not to punish the team if ratings are low, but to genuinely understand how everyone feels the team is doing so that you can adapt and improve. This approach enables you to reflect on how you are (or aren't) living out your values as a learning organisation. As Hannah Elliott, Transformation Lead at Kirklees and working group member, has described:

It is important to articulate how you are a learning organisation and what this looks like at the frontline and strategic leadership level.

How we learn with others outside local government

Learning externally is just as important as learning internally, and requires a different set of methods that draw on a similar set of values and principles. The same trust, authenticity, and curiosity you demonstrate among team members should be extended to people who live in the communities you serve.

Across any measurement tool or method you use to learn about residents and the complex problems they face, it is important to understand historical context and create the conditions for people to feel safe and comfortable opening up and sharing. For example, many in local government are currently trying to better understand the experiences and challenges Black communities face so they can improve their services to better meet their needs. To seek out and hear these truths, it is critical that councils understand how racism has shaped and continues to shape Black communities' relationship with government, and what this means for how we enable greater autonomy and ownership for people sharing stories and insights.

Storytelling evaluation methodology

One method that gives participants ownership is the storytelling evaluation methodology developed by Arts at the Old Fire Station (AOFS). Instead of predefining outcomes to measure against, the storytelling evaluation methodology lets people identify what outcomes matter to them and decide how they want to talk about them. Compared with traditional attempts to evaluate impact, this method more meaningfully captures the complexity of individuals' lives, and it is a genuinely fun, enjoyable process for all involved.

Sarah Cassidy, the Inclusion Manager at AOFS, says that previous attempts at measuring impact led to asking people questions that actually got in the way of the work and damaged the relationship between those evaluating and those being evaluated. By contrast, the creative storytelling method offers the interconnected benefits of:

  • Creating the conditions for psychological safety among “storytellers” and “story collectors”

  • Unlocking new, generative insights that would not have been shared without that feeling of safety

  • Developing good-quality relationships

Sarah and the team at AOFS are working with Oxford Hub to collect and analyse the stories of people involved in Oxford's community response to Covid-19, which will be shared later this year. Sara Fernandez, who leads Oxford Hub and is a working group member, reflects on the value of this method: “you cannot really measure the quality of relationship in the community unless you are capturing stories.”

To bring stories into your measurement work, remember an important point that Alexis Pala, who practises meaningful measurement and innovation at Y Lab, shared with inspiration from Dr Emma Blomkamp:

Every story has a number and every number has a story.

Next time you are evaluating quantitative data, it is worth considering how to capture the story behind the numbers.

Life Journey Maps

At Huntingdonshire District Council, Oliver Morley, Corporate Director (People) and one of our working group members, has been working with Claudia Deeth, Community Protection and Enforcement Officer, and a range of partners from across the system to create Life Journey Maps.

To create the maps, the council and partners start by providing “meaningful conversations” training (based on the Making Every Contact Count training commonly used in the health sector) to staff, designed to help them have conversations with residents that get at the root cause of an issue, listen for keywords, and deeply understand residents' perspectives and where they could use help. Trained staff then proactively reach out to residents (e.g., those who have missed a council tax payment) to have a meaningful conversation and identify key moments in residents' lives, their perception of these events, and the impact these events had on them.

From these conversations, staff create a visual Life Journey Map that plots out life events and quantifies the level of risk associated with each event using tools such as the Holmes-Rahe stress scale, which weights the stress level of each life event. By adding together the risk levels of all the events, we can calculate a total “risk” score for an individual: the more traumatic events an individual has experienced, the more susceptible they are likely to be to a crisis (e.g., homelessness, mental illness, arrest). Each event is also associated with the cost to various systems within local government (e.g., police, social care, education).
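To make the additive scoring concrete, here is a minimal sketch in Python. The weights shown are a small subset of the published Holmes-Rahe “life change units”, and the risk bands follow the scale's conventional cut-offs; the function names and event list are illustrative assumptions, not part of the council's actual tool.

```python
# A minimal sketch of the additive risk scoring described above.
# Weights are a subset of the published Holmes-Rahe life change units;
# event names, function names, and the example are illustrative only.

HOLMES_RAHE_WEIGHTS = {
    "death of spouse": 100,
    "divorce": 73,
    "personal injury or illness": 53,
    "dismissal from work": 47,
    "change in residence": 20,
}

def total_risk_score(life_events: list[str]) -> int:
    """Sum the stress weight of each recorded life event."""
    return sum(HOLMES_RAHE_WEIGHTS.get(event, 0) for event in life_events)

def risk_band(score: int) -> str:
    """Classify the total score using the scale's conventional cut-offs."""
    if score >= 300:
        return "high risk of stress-related crisis"
    if score >= 150:
        return "moderate risk"
    return "low risk"

events = ["divorce", "dismissal from work", "change in residence"]
score = total_risk_score(events)
print(score, "->", risk_band(score))  # 140 -> low risk
```

In the actual Life Journey Map, each scored event would also carry the qualitative context from the meaningful conversation and the cost to the relevant local government systems.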

See a detailed version of the map here.

The Life Journey Map combines quantitative data points with qualitative insights to provide a holistic view of key points in a person's life. The map serves multiple purposes, enabling:

  • Preventative intervention: by meaningfully engaging with residents and understanding stressful experiences that may trigger crises, we can provide tailored and early support to prevent future harm. As Oliver describes:

    Bad things do happen, but most of the time the signs are there. Quantifying, preventing, and addressing risks reduces poor outcomes.

  • Learning from failure: the Life Journey Map strengthens our understanding of the complex series of life events that contributed to a crisis in someone's life. By examining how these interrelated events compounded, we can see where we could have intervened earlier and adapt how we handle future situations based on what we learn. For example, if we see that a person who experiences domestic abuse as a child is less likely to experience mental illness later on when they have protective factors (e.g., social connections, community support), we can design our systems to provide more protective factors as soon as a trigger like domestic abuse is identified.

  • Systems change: A Life Journey Map illuminates different life events that siloed actors may not have seen otherwise. For example, if an individual applied for social care, but did not meet the criteria to receive support, the Life Journey Map enables us to consider other forms of support that individual may need. Instead of treating it as a transaction where the person is turned away, a more holistic view of a person's needs allows us to treat it as a relationship and refer them to other services within the system, besides formal care, that may be of use.

We are interested in building this tool out and testing it with other local authorities across the UK. If you would like to be involved in this effort, get in touch.

How to make measurement for learning stick

These measurement methods and tools are part of a bigger movement we are seeing in local governments across the UK. In response to COVID-19, many of the traditional forms of measuring impact - people in positions of power setting and managing targets for others to hit - suddenly disappeared overnight: charitable foundations suspended performance targets, the Care Quality Commission and Ofsted suspended their inspection regimes, and NHS England turned off all financial targets for the year.

With this top-down approach to measurement receding, our working group is proposing an alternative approach to measurement focused on learning. Measurement for learning does not mean removing the metrics and accountability regulatory bodies provide, but rather complementing quantitative data with qualitative insights and giving those closest to the problems the power to define what they want government to be accountable for. We believe this approach can fundamentally change how governments approach their work, enabling them to better cultivate relationships with residents and ultimately improve their services.

To make the case for why we need this approach and how it is already coming to life, we have published this blog series and will be publishing a discussion paper in the Autumn that expands on these ideas and shares additional examples and methods. Follow along on Twitter using #MeasurementforLearning. We are excited to see how our thinking can contribute to growing the conversations and changes taking place at other levels of the system.

We hope these methods and tools can serve as helpful inspiration for local government across the UK and indeed across the world. If you are interested in adapting them or trialling something similar in your own local government, do let us know.

The Measurement for Learning report will be part of a series of learning products exploring the new operating models emerging in local government - how they work, what they look like, and the key features needed to replicate success elsewhere. It draws on the experience of the twenty pioneering local authorities participating in the Upstream Collaborative, which has been led by Nesta in partnership with Collaborate CIC. The full package of reports will launch in September 2020.



Written by:

John Burgoyne, Former Program Manager, North America