Is data observability the solution to public sector data challenges?

The public sector faces specific challenges around managing its data – Cribl argues that data observability is the answer

Posted 7 December 2021 by Christine Horton

The public sector would benefit from adopting an ‘observability-first’ approach to better manage its data.

That’s according to Nick Heudecker, senior director of market strategy at Cribl. Cribl creates controls to route security and machine data to where it has the most value. Known as an observability pipeline, Cribl says it can help cut costs, improve performance, and “get the right data, to the right destinations, in the right formats, at the right time”.

Observability differs from traditional data monitoring in that it enables users to interrogate their environments without knowing in advance the questions they need to ask. It is more operational than analytics-based: instead of running point-to-point solutions, users can take all event data – logs, metrics and trace data – and run it through a centralised point. Central to the concept of observability is putting decision making back into the hands of the organisation.
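The centralised routing idea above can be sketched in a few lines of code. This is a minimal illustration of the concept, not Cribl’s product: an event of any kind (log, metric, trace) passes through one routing point, which decides per event where it should be sent. All rule and destination names here are hypothetical.

```python
# A minimal sketch of an observability pipeline: heterogeneous event
# data flows through one central routing point, which matches each
# event against rules to pick its destinations.

def route_event(event, rules):
    """Return every destination whose rule matches the event."""
    return [dest for matches, dest in rules if matches(event)]

# Hypothetical rules: authentication logs also go to a SIEM tool;
# everything is kept in cheap cold storage to satisfy retention mandates.
rules = [
    (lambda e: e["type"] == "log" and e.get("source") == "auth", "siem"),
    (lambda e: True, "cold_storage"),  # catch-all: keep everything
]

event = {"type": "log", "source": "auth", "msg": "failed login"}
print(route_event(event, rules))  # ['siem', 'cold_storage']
```

Because routing decisions live in one place, changing where data goes – the “control” Heudecker describes – means editing a rule, not reconfiguring every source.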

However, Heudecker says public sector organisations face unique hurdles when it comes to observability.

“These organisations fall under strict compliance and regulatory mandates requiring them to save data for long periods, even though it has questionable long-term value,” he explains.

“Data storage requirements consume vast amounts of high-cost infrastructure, which has its own outsized impact on already tight budgets. It’s difficult to make targeted optimisations around how and where data is stored across monitoring infrastructure. Many organisations simply keep everything because they don’t have control over where data can be sent.”

Additionally, he says public sector organisations, like their commercial counterparts, deploy multiple tools for logging, security, and performance monitoring. These tools often report much of the same data, leading to duplication across data sets.

“Sharing data across tools is difficult, if not impossible,” says Heudecker. “Vendors have no interest in easily sharing data across services. They want to deliver an end-to-end experience. While this is understandable from the vendors’ perspective, it also creates isolated data silos that are difficult to use across teams.”

He also notes that there is an increasing need to share data with outside parties, such as during a wide-ranging security investigation.

“An example is the SolarWinds breach. Piecemeal approaches to sharing data from dozens of different systems with outside parties are slow and error-prone,” says Heudecker.

“Adopting flexible observability infrastructure allows public sector entities to keep everything – all the data – in the most cost-effective way. An observability-first approach removes the barriers between the sources of data, like agents and logs, and their destinations. This lets teams share data across tools, as well as more easily introduce new tools without needing to deploy a fresh crop of agents or acquire even more expensive storage infrastructure.”
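The decoupling Heudecker describes – removing the barrier between data sources and their destinations – can be sketched as a simple fan-out. This is an illustrative toy, not Cribl’s API; the class and method names are invented for the example.

```python
# A hedged sketch of "observability-first" decoupling: sources never
# know about destinations, so introducing a new tool means registering
# another sink with the pipeline, not deploying a fresh crop of agents.

class Pipeline:
    def __init__(self):
        self.destinations = []

    def add_destination(self, name, handler):
        # New tools plug in here; existing agents and log sources
        # are untouched.
        self.destinations.append((name, handler))

    def ingest(self, event):
        # Fan the same event out to every registered tool.
        for name, handler in self.destinations:
            handler(event)

received = []
pipe = Pipeline()
pipe.add_destination("monitoring", lambda e: received.append(("monitoring", e)))
pipe.add_destination("audit_archive", lambda e: received.append(("audit", e)))
pipe.ingest({"metric": "cpu", "value": 0.93})
print(len(received))  # 2 -- one event reached both tools
```

Sharing data with a new team or outside party then reduces to adding one destination at the pipeline, which is the contrast Heudecker draws with piecemeal, system-by-system sharing.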