Article, April 15th, 2015
Technology

Big data in government: making numbers count

Article highlights

  • It’s not that governments are short of data – the question is how to use it to best effect
  • The focus now should be to embed analytics into strategic and frontline decision-making
  • The design of any new big data system should be driven by its objectives


There is a room, three levels beneath the east wing of the White House, where outsiders seldom enter. A world away from the formal reception areas, the space - its official title is the Presidential Emergency Operations Center - is the bunker where then Vice-President Cheney and his senior aides retreated on the morning of 11 September 2001.

With President Bush racing west on Air Force One from Florida, first to Louisiana and then to Nebraska, the room became the nerve centre for the initial response to the attacks. Orders - grounding 4,000 planes in American airspace, evacuating congressional leaders - came thick and fast.

What citizens and the media didn't know at the time was that many of these decisions were taken without the support of accurate, up-to-date data and bespoke technology. Telephone systems and video conferences - both in the bunker and 45,000 feet up - proved woefully unreliable. It also later transpired that the terror attacks themselves could have been prevented if the US government had benefited from better data aggregation and analysis. The information existed but it wasn't being shared.

The US, though, has learned from its mistakes. Fast-forward 14 years and the Department of Homeland Security - set up in the aftermath of the attacks - has overall responsibility for keeping America safe. The result of a merger of 22 individual agencies, it is the hub for the timely aggregation, analysis and dissemination of information that is critical to national security and economic stability.

The US also revamped the operations of the Executive Office of the President (EOP) after the attacks. The EOP is made up of many entities, including the Office of the Director of National Intelligence, the National Security Council, and the Council of Economic Advisors. Working side by side, these entities now benefit from clearer, more transparent information collection and analysis that, in turn, leads to stronger policy options and insights for current and future occupants of the Oval Office.

Decisions, decisions …

Although 9/11 is, of course, an extreme example, policymakers have little respite from the constant demand to make decisions that potentially affect millions of people. They are a recurring feature of life in power. Where to spend scarce funds? Which reforms will deliver the strongest public impact? Which new technology can best enhance existing processes? The answers are rarely straightforward.

This ever-shifting terrain demands not only that the best and brightest be attracted into public service - not always simple, given the higher salaries usually on offer in the private sector - but also that government can rely on state-of-the-art systems and data that give policymakers all the information they need to guide their decisions. It's not that governments are short of data - far from it. On a daily, even hourly, basis their programmes and services generate, and require, huge amounts of data and other information. The question is how to capture and use it to best effect.

A question of design

An important starting point is for governments to remember that the design of any new big data system should be driven by its objectives. Technology-driven efforts often fail. Instead, policymakers should be guided by the vision of achieving better and faster decisions through data and information analysis, insights and policy support. In practice, this means gathering data on a wide range of topics from a diverse range of sources, which is then analysed in a central location by a team of advisors. It is they who should then make policy recommendations to the prime minister, president or ministries. So, how can they get there?

The pace of technological change means that governments can now develop a high-level IT architecture that is both scalable and flexible while remaining good value for money. One-off systems are a thing of the past. Instead, data can be aggregated from various sources, removing any need to rely on potentially outdated legacy systems. Nor is there any need to build a supercomputer for these tasks, since the requirements and queries will change over time.
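As a minimal sketch of that aggregation principle - the source formats, field names and figures below are invented for the example, not drawn from any real government system - records from two hypothetical systems can be mapped onto one shared schema for central analysis:

```python
import json
from datetime import date

# Hypothetical exports from two separate government systems:
# one produces CSV-style rows, the other JSON documents.
csv_rows = [
    "2015-03-01,housing,1200",
    "2015-03-02,transport,340",
]
json_docs = [
    '{"reported": "2015-03-01", "category": "health", "cases": 57}',
]

def from_csv(row):
    """Map one CSV row onto the shared schema."""
    day, topic, value = row.split(",")
    return {"date": date.fromisoformat(day), "topic": topic, "value": int(value)}

def from_json(doc):
    """Map one JSON document onto the same shared schema."""
    rec = json.loads(doc)
    return {
        "date": date.fromisoformat(rec["reported"]),
        "topic": rec["category"],
        "value": rec["cases"],
    }

# Aggregate everything in one central place, ready for querying.
central = [from_csv(r) for r in csv_rows] + [from_json(d) for d in json_docs]
by_topic = {rec["topic"]: rec["value"] for rec in central}
print(by_topic)  # {'housing': 1200, 'transport': 340, 'health': 57}
```

The point of the design is that each new source only needs its own small adapter function; the central store and the analysis on top of it stay unchanged.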

Another priority for departments is to develop analytics capabilities that can make use of the huge amounts of data they produce and receive. Organisations can get results even from the analysis of unstructured data by using self-learning systems that improve their results over time. Unfortunately, many public sector organisations find themselves doing ad hoc analysis of specific issues, reliant on poor-quality data that has not been integrated and may be out of date. The focus now should be to step up and embed analytics into strategic and frontline decision-making, backed by strong executive-level support.
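To illustrate the self-learning idea, here is a toy sketch rather than any specific government system: an online classifier, loosely based on multinomial Naive Bayes, whose predictions over unstructured text improve as each new labelled example arrives. All texts and labels are invented:

```python
import math
from collections import defaultdict

class OnlineTextClassifier:
    """Toy self-learning classifier: every labelled example updates
    per-label word counts, so later predictions benefit from all the
    data seen so far (a crude multinomial Naive Bayes)."""

    def __init__(self):
        self.word_counts = defaultdict(lambda: defaultdict(int))
        self.label_counts = defaultdict(int)

    def learn(self, text, label):
        # Incremental update: no retraining from scratch is needed.
        self.label_counts[label] += 1
        for word in text.lower().split():
            self.word_counts[label][word] += 1

    def predict(self, text):
        best_label, best_score = None, float("-inf")
        for label in self.label_counts:
            total = sum(self.word_counts[label].values())
            score = 0.0
            for word in text.lower().split():
                # Crude add-one smoothing so unseen words don't zero the score.
                score += math.log((self.word_counts[label][word] + 1) / (total + 1))
            if score > best_score:
                best_label, best_score = label, score
        return best_label

clf = OnlineTextClassifier()
clf.learn("delayed train service complaint", "transport")
clf.learn("hospital waiting times complaint", "health")
print(clf.predict("train delayed again"))  # transport
```

Each call to `learn` makes subsequent predictions slightly better informed, which is the essence of a system that improves its analysis over time.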

This cultural change places a huge burden on government recruiters, however, as these skills are hard to find and to develop over time. Over the next five years, for example, the UK will need 250,000 data scientists and analysts, while only an estimated 20% of that number will be available - a similar picture exists across the western world.
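The scale of that gap follows directly from the figures above:

```python
# Figures quoted in the article for the UK over the next five years.
needed = 250_000         # data scientists and analysts required
available_share = 0.20   # estimated share of that demand that can be filled
shortfall = int(needed * (1 - available_share))
print(f"Projected shortfall: {shortfall:,}")  # Projected shortfall: 200,000
```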

But it's not all bad news. The UK may be short of key staff but its Government Digital Service (GDS), which implements the government's ‘digital by default' strategy, is also seen as a leading pioneer. As part of its efforts, it produces a blog about its progress and openly reports data on service costs and volumes. The GDS publicly tracks its progress against targets and publishes the source code for other governments to use. Talk about transparent.

Such examples show that governments are increasingly grasping the potential offered by big data and analytics - and so they should. After all, they offer a huge opportunity for policymakers to optimise policy, programme, and service design. Now that's making public impact.

 

FURTHER READING

  • Power to the people. Few countries have embraced the digital era as successfully as New Zealand. We talk to one of its government's key digital transformation leaders, Richard Foy, about how they've done it.
  • Computer says yes. Governments are increasingly reliant on digital technology to deliver public services - and Australia's myGov service is a potential game-changer, says Gary Sterrenberg
  • Online, on track? Miguel Carrasco looks at how policymakers can improve the delivery of digital services
  • Digital dawn. It may not be obvious, but US policymakers have had an important role to play in the creation of today's digital era, says David Dean
  • Core workouts. With governments increasingly seeking to transform their core systems, Andrew Arcuri explains how they can move from achieving good results, to truly great
