Article • July 11th, 2019
Legitimacy • Innovation

“The simple answers are wrong:” Toby Lowe on the need for a new kind of accountability in public services

Article highlights


"If we can’t trust the data in the system then the system itself becomes a lie" @tobyjlowe on #complexity w/ @ElenaBagnera & @MargotGagliani


The #FutureGovernment must be designed for #complexity, according to @tobyjlowe. No simple answers here!


In complex public services we must value relationships, learning over measurement & collaborating across systems – @tobyjlowe @CollaborateCIC



Toby spoke to Elena and Margot about his work on complexity research and the changes we need to make in order to allow for real, meaningful accountability in public service delivery. As someone with experience of providing services on the ground, Toby was once asked to supply data that he felt was untrue. It was an experience that set him on a new career path and led ultimately to his current research post. In the discussion, there was particular interest in how his views fit into CPI's work on the future of government ⁠— here is a summary of their conversation.

What led you to research complexity in public service delivery?

Years ago I used to run an arts organisation in Newcastle. When the local council adopted a new outcomes-based accountability approach, I found myself being asked to make a claim about our results that simply wasn't true. We worked for a few hours a week with young offenders and I was asked to say that this had a clear and measurable effect on their rates of re-offending. In reality it was impossible to quantify the effect we had. The service commissioner understood this, yet he needed me to lie so he could tick the box that would allow him to give us funding for another year.

That was my wake-up call. The system was treating complex situations as if they were simple. The factors that can lead young people to re-offend are varied and messy. They may be connected to housing, employment or addiction, to name just a few. The data was being used in a way that was reductive and ultimately false.

If we can't trust the data in the system, then the system itself becomes a lie ⁠— and until we change this, it will always be difficult to create positive outcomes or even to speak openly about what we can achieve.

My subsequent work at the Newcastle Business School all grew out of this realisation.

What were your research findings?

As I began to research the use of outcomes for performance management, it became clear that this form of performance measurement actually encourages people to game the system. At best, they focus on the numbers that allow them to tick the box they are responsible for, with the result that it becomes harder to work across services or to take into account the variable and interrelated factors that affect outcomes on the ground. Many studies provide evidence of this.

The current system, based on the New Public Management approach, also offers an overly simplistic way of assessing the work of people who implement public policy by performance-measuring them against purely quantitative targets. It assumes they need external motivation and should be rewarded or penalised accordingly. Public servants are held to account for hitting targets they almost certainly, in reality, don't control. This only serves to encourage the manipulation of data ⁠— further supporting the lie of simple outcomes in a complex world.

All this was a revelation to me, but as I began to present my findings to practitioners I didn't get the push-back I was expecting. People already knew the system didn't work, but they couldn't see what a viable alternative looked like.

Why is it so hard to let go of this type of performance management in public policy?

Once you've got a system that produces good-looking data, even if the data is meaningless it can become addictive. It also creates the myth of control, whereby politicians can make a pledge to, say, reduce crime by 10%. It's comforting to use numbers in this way and has become an accepted part of the dominant political discourse.

Partly it's human nature. We'd all feel more comfortable in a simple world where progress could be tracked in a linear way.

But everyone knows that's not the reality: the world is complex and almost everything we want to change is interconnected, overlapping and multi-factorial.

So I began to wonder, what if we abandoned the pretence that the world is simple and developed a policy approach that embraces complexity, with all the uncertainty and vulnerability this implies?

What does this mean for accountability?

True accountability is not about counting but about asking people to give an account of their actions as part of a dialogue in which they explain the decisions they have taken in the specific context they are working in. It's also not just about the traditional hierarchical relationship. There are multiple relationships of accountability. Your peers could ask you to account for your decisions, as could a member of the public receiving the service, or an ombudsman or professional body. The main thing is that real accountability involves a conversation.

We need to think differently about measurement. We are very exercised by the question “What should we measure?” It's important, but it's secondary to the question “Why are we measuring?” If we measure to make ourselves accountable to others (to ‘demonstrate our impact', for example) we become subject to Campbell's Law ⁠— the act of measurement corrupts the process it is intended to monitor. Instead, we should measure for learning ⁠— and use learning as the driver for performance improvement (rather than accountability). This is the appropriate reason to measure outcomes (or anything else). Measuring in a way that helps us to learn is partly about the effective capture of both quantitative and qualitative data and partly about deciding what to capture in any given context ⁠— ideally the data should be chosen by the people doing the work.

When I asked people how they respond to complex situations, the words that kept coming up were “relationships”, “trust” and “autonomy.” If you treat people as if they are intrinsically motivated to do the work, then the measurements become learning tools for those people; they can put the abstracted and simplified data they get from performance measurement alongside their own experience, qualitative feedback from other sources and the experience of their peers, and turn that into a set of learnings they can use to adapt and improve their practice.

What would an alternative approach to public services look like?

In the report I co-authored with Collaborate, Exploring the new world: practical insights for funding, commissioning and managing in complexity, we describe a new approach that is emerging in local governments and funding bodies across the UK. We call this approach “Human, Learning, Systems.”

The human element is based on putting human relationships at the core of public service. It requires trust and communication at all levels: between those delivering the service and the people being served, and between commissioners and service deliverers. The latter matters because, when delivery is based on listening, the service becomes bespoke and cannot be so easily performance managed, so allowing people on the frontline a degree of autonomy is essential.

The learning element is about allowing learning to drive performance improvement.

Working effectively requires the ability to adapt: the context that enables interventions to ‘work’ is constantly changing, so our interventions need to change with it. “What works” is the continuous process of listening, learning and adapting. This means that the role of strategic management is to enable the people doing the work to learn effectively and, through that process, to make better decisions in times of uncertainty.

The systems element means recognising that the outcomes we care about are not “delivered” by organisations but are the result of hundreds of factors ⁠— people, organisations, processes, cultures ⁠— interacting together in a system in unpredictable ways. If we want better outcomes, we need to help the people and organisations in these systems to collaborate and coordinate more effectively. In other words, healthy systems produce good outcomes.

How can you put this into practice?

None of this is easy. The learning element in particular requires a radical rethink. How within an organisation do you create safe spaces for learning and reflection, where people can talk openly about errors and uncertainty with their peers? If you're working in a complex environment ⁠— for example, with homeless people or young offenders who may need help from several different services in order to achieve a long-term positive outcome ⁠— errors and uncertainty are pretty much the only things you can guarantee. You will feel uncertain about what you're doing and some of your judgements will look like mistakes, because you will always be interacting with elements you can't control.

We recently worked with several organisations that deal with adults with complex needs. As an experiment, I worked with health and social organisations across North East England to help them create safe spaces for peer-to-peer performance management. They created a Learning Community to help build a positive error culture within their organisations and went on to write a manual explaining how to take this approach.

When we revisited the NHS group and asked about the effect of the new reflective space on their practice, they explained that it had improved their ability to make decisions in situations of uncertainty. Before, they had tended to avoid making a decision because they were wary of making the wrong one. Through this work they realised that often there is no such thing as ‘the right decision’, so they were more prepared to try something and learn from it through subsequent discussion with their peers.

What's next?

Gateshead and Plymouth councils are just two examples of local governments in the UK that are framing the delivery of public services, and the accountability for those services, in this new way. This way of thinking is a significant change from the norm and so every positive example we can cite is helpful in persuading people of the value of this approach.

It will not be an easy task to bring about change at the level of government departments, because we are challenging established thinking about how we achieve outcomes and ultimately about how we set up our political and institutional systems. But we are certainly beginning to see inroads being made, particularly by the people who deliver services on the ground, who are working in situations of greater complexity and more uncertainty than ever before.

***

If you are interested in finding out more about how to adopt a Human Learning Systems approach, Toby and his team are building a movement to make this approach more commonplace. If you'd like to be part of that, join the online Community of Practice to find resources, and connect with others who are adopting this way of working.

#FutureGovernment

#FutureGovernmentLeadersSeries
