Whilst digital transformation is not a new concept for those operating within the public sector, the adoption of modern technologies has traditionally been hampered by budgetary restraints and compliance concerns. However, 2020 changed this. The pandemic and subsequent lockdowns forced the majority of public sector services online, demonstrating just how essential the right IT systems are to business continuity and information accessibility.
To continue delivering services to the public, all organisations – from healthcare and education establishments to local and central government – have had no choice but to transform at speed, adopting myriad new technologies. In fact, recent reports reveal that in the 12 months leading up to August 2020, public sector procurement spending on IT and telecoms rose to £7.8 billion.
This digital momentum is showing no signs of slowing down, with the UK government recently announcing the launch of its National Data Strategy. This puts data at the heart of the UK’s COVID-19 recovery plan, proposing an overhaul of data usage across the public sector and encouraging the digital transformation projects needed to support it.
If public sector organisations are to reap the rewards of this new digital landscape, they need to ensure that they have the infrastructure in place to effectively manage and understand their data. However, in many cases, this is much easier said than done.
A siloed landscape
Whereas private sector organisations utilise data insights to drive profit and boost productivity, the public sector relies upon data to improve the lives of the general population. Whether this is within healthcare, social services, policing or the judiciary, data holds the key to improving services at a lower cost and making multi-agency co-operation more efficient.
However, data – and the insights it provides – is notoriously difficult to extract. Public sector data is often buried in a hugely complex network of applications and siloed databases. Different agencies ‘own’ different pots of data, and each uses different semantics or formats to define and categorise that data. To make matters more difficult, legacy IT systems – which are still in place in many areas of government and the National Health Service (NHS), for example – were not designed with sharing capabilities in mind. This makes it extremely difficult to find and retrieve data, let alone translate it into meaningful, actionable information.
Traditional means of sharing data across various public sector agencies are no longer fit for purpose. In fact, the most common method of data integration, ETL – where data files are extracted from an existing source, transformed into a common format and loaded into a new location – has been around since the 1970s. It’s no surprise that this technique’s limitations, most prominently around security and governance, are becoming increasingly apparent in our data-intensive age.
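The three ETL stages described above can be sketched in a few lines. This is a minimal, illustrative pipeline only – the data, schema and target store are all hypothetical, not taken from any real agency system – but it shows why ETL implies copying: the load step always writes the data into a second location.

```python
import csv
import io
import sqlite3

# Hypothetical raw export from one agency, in its own naming and date format.
RAW_CSV = """record_id,citizen_name,visit_date
101,SMITH; JANE,31/03/2020
102,JONES; TOM,01/04/2020
"""

def extract(source: str) -> list:
    """Extract: read records from an existing source (here, a CSV export)."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list) -> list:
    """Transform: normalise each record into a common target format."""
    out = []
    for row in rows:
        surname, forename = [p.strip() for p in row["citizen_name"].split(";")]
        day, month, year = row["visit_date"].split("/")
        out.append((int(row["record_id"]),
                    f"{forename} {surname}",
                    f"{year}-{month}-{day}"))  # DD/MM/YYYY -> ISO 8601
    return out

def load(rows: list, conn: sqlite3.Connection) -> None:
    """Load: write the transformed records into a new location (a copy)."""
    conn.execute("CREATE TABLE visits (id INTEGER, name TEXT, visit_date TEXT)")
    conn.executemany("INSERT INTO visits VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT name, visit_date FROM visits ORDER BY id").fetchall())
```

Once loaded, the copy must be secured and governed separately from the source – which is exactly the limitation the article points to.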
Taking back control
Public sector organisations need to be able to share and access data without compromising on security and governance. Copying data to data lakes or sharing it through multiple files can very quickly add to IT teams’ workloads, increase data management costs and leave organisations more vulnerable to data theft.
A logical data fabric, built on data virtualisation, can help to combat this. Through connecting to almost any data source, data virtualisation enables organisations to give individuals across various agencies access to the data that they need, no matter where it is stored. It also enables data to be conveniently accessed through front-end solutions, such as applications and dashboards, without the user having to know its exact storage location. This means that those working across the public sector can access the information that they need quickly and easily, regardless of its format or location.
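The idea can be illustrated with a minimal sketch. This is not Denodo’s implementation – the two agency databases, their schemas and the view function are all invented for the example – but it shows the principle: each source is queried live where it lives and mapped to one common schema, so the consumer never needs to know where the data is stored.

```python
import sqlite3

# Two hypothetical agency databases with different schemas and formats.
health = sqlite3.connect(":memory:")
health.execute("CREATE TABLE patients (nhs_no TEXT, full_name TEXT)")
health.execute("INSERT INTO patients VALUES ('NHS123', 'Jane Smith')")

council = sqlite3.connect(":memory:")
council.execute("CREATE TABLE residents (ref TEXT, surname TEXT, forename TEXT)")
council.execute("INSERT INTO residents VALUES ('C-77', 'Jones', 'Tom')")

def virtual_person_view() -> list:
    """A unified 'virtual view': each source is queried at request time and
    mapped to a common schema. No data is copied to a central store."""
    rows = []
    for nhs_no, name in health.execute("SELECT nhs_no, full_name FROM patients"):
        rows.append({"id": nhs_no, "name": name, "source": "health"})
    for ref, surname, forename in council.execute(
            "SELECT ref, surname, forename FROM residents"):
        rows.append({"id": ref, "name": f"{forename} {surname}",
                     "source": "council"})
    return rows

for person in virtual_person_view():
    print(person)
```

Because the view resolves queries against the live sources, access can be granted or revoked in one place without shipping files around – the property the next paragraph describes.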
Data virtualisation enables all of this without the need to copy or move data. In other words, organisations can remain in control of who accesses what. It’s no wonder that the popularity of data virtualisation is increasing, with Gartner tipping that “through 2022, 60 percent of all organisations will implement data virtualisation as one key delivery style in their data integration architecture.”
Whilst recent months have presented many challenges for the public sector, they have also created many opportunities. There’s no doubt that the pandemic has acted as a catalyst for change, significantly accelerating digital transformation. Modern technologies, and a newly established openness to embracing them, are likely to have a lasting impact on the entire sector. However, to be successful in this new data-driven landscape, public sector organisations must create an architecture which is conducive to the sharing of information and resources, without compromising on security. Only then can they put themselves in the best position for whatever comes next.
Charles Southwood is regional VP, Northern Europe and MEA at Denodo