As the volume of data continues to grow exponentially, data professionals in both the public and private sectors are grappling with how to harness its power while maintaining public trust. At a recent Think Data for Government panel discussion on the future of data, speakers from a range of organisations shared their perspectives on the challenges and opportunities ahead.

One of the key themes that emerged was the need to move beyond simply collecting and storing data, towards extracting meaningful insights that can drive better decision-making. As Adobe CTO Tony Stranack noted: “Our challenge today is extracting information from that data. The leaders of the future will be those that can get information rather than just creating more data.”
This sentiment was echoed by Dr. Laura Gilbert, chief analyst and director of data science at 10 Downing Street. She emphasised the importance of using AI and other tools to “make the government more human and get out better public services really quickly.”
However, she stressed the need to do so in a transparent and trustworthy manner, by “telling people what we’re doing, by releasing the code, by proving we’re not, in fact, using AI to try and take people’s benefits away, to take people’s jobs away, or to replace you.”
This challenge is particularly acute in the public sector, where citizens may have high expectations of government’s data capabilities, but low levels of trust. “What I would like to do is publish all of the data we have on everything not related to national security, etc. and allow people to test what we’re actually doing,” said Dr. Gilbert.
Building that trust with the public was a recurring concern among the speakers. “Trust and guidance are relevant. People believe Facebook ads before they believe us. And the only way that you can really get people to trust you is by actually being trustworthy,” said Giuseppe Sollazzo, deputy director, head of data enablement at DWP.
Presenting data in new ways
Stranack believes the public sector has to be more creative in the way that it presents data, “and think of the different audiences that might want to consume that, and present something in a completely different way to someone else.”
To that point, Simon McLellan, head of data engagement at the Met Office, shared how his organisation is experimenting with different ways of presenting weather data.
“We had a new version of the app. We looked at the beta version and asked, do you find this more useful than our other version? It’s the same forecast behind it, but a different way to present that forecast within the Met Office app, and we wanted to see whether people found that one more useful than this one.”
This user-centric approach extends beyond just data presentation, to the very process of data collection and usage.
“What we do less well, potentially, in the public sector, is ask people what they think, and have a mechanism where we’re allowed to do that safely,” noted Traveline chief executive Julie Gray.
The speakers also highlighted the importance of data quality and the need to address issues of bias, incompleteness, and inaccuracy.
“The reality is that the data is exhausted, incorrect, incomplete, not capturing the right set of things that we need to catch. So there’s a lot we can do to gather better data,” said Sollazzo, who emphasised the need to understand “data lineage” – the provenance and quality of data – in order to make it truly usable for future purposes.
Ultimately, the speakers painted a picture of a future where data professionals must strike a delicate balance between generating insights and maintaining public trust.