The pivot from ‘cloud first’ to ‘data first’ in the public sector

Russell Macdonald, CTO for public sector at Hewlett Packard Enterprise (HPE), explains why the next wave of digital transformation needs to be data-first modernisation.

Posted 6 December 2023 by Christine Horton

Data is everywhere and fundamental to our lives and society at large. While there was an initial rush to move all data to the cloud, that rush has slowed: the recent HPE Public Sector Data Strategy Report shows that as much as 73 percent of public sector data remains outside of public cloud.

At the same time, the volume and variety of data have been growing exponentially. The global datasphere is doubling roughly every four years, and most of this growth is not happening in data centres or clouds, but at the edge.

Therefore, even within a single government department or public service, there are pockets of data distributed across a wide range of platforms and locations, from the edge to data centres to public clouds. In fact, 49 percent of respondents said that their data is buried in silos, which is a significant barrier to achieving a data-first approach.

“The first wave of digital transformation in government focused on consolidating thousands of government websites into a single GOV.UK domain and getting to the cloud in order to save money, but in many cases missed the opportunity to evolve the way they were organised to deliver those services,” said Russell Macdonald, CTO for public sector, Hewlett Packard Enterprise (HPE).

Macdonald points out that 45 percent of respondents to the HPE research say they have no data strategy at all, and many of the remainder say their data strategy is folded into their IT strategy. This suggests they view data as an ‘IT problem’ or a matter of ‘storage’, rather than as an asset that holds value for the organisation, the public sector as a whole, or UK society at large.

“The next wave of digital transformation therefore needs to be data-first modernisation,” said Macdonald. “This means restructuring the organisation around the fundamental data building blocks that define its reason for being and following the data journey from the point of collection/creation at the edge through to the transactions that it enables in back office systems and onto how data is analysed, how insights are gleaned from it and how those insights inform decision-making and strategy going forwards.”

Macdonald also points out that the structure of the public sector means that no single organisation has a complete view of UK citizens, businesses and other stakeholders. It therefore would not be lawful, appropriate or justifiable to create a ‘UK data lake’.

“Rather, data is highly distributed and we need to take a distributed approach which involves better interoperability between public sector organisations as well as with relevant private sector companies, where it is necessary to gain a better understanding of the ‘customer’ and drive better service outcomes,” he said.

Finally, the public sector needs to learn from digital transformation in the private sector, for example financial services, where interoperability, data standards and APIs are driving innovation and giving customers a better experience and greater choice.

Said Macdonald: “Everyone in the public sector, from senior civil service to executive officers, should understand the role data plays in their organisation and how to use it to achieve their goals, improve outcomes and drive best value.”

Both the HPE Public Sector Data Strategy Report, which draws on Freedom of Information (FoI) data from the UK public sector, and the House of Data original documentary series are available on the HPE UK public sector website. You can also access the HPE Data Value Creation Maturity Model to find out more about HPE’s global research results on data maturity in the public sector.