Artificial intelligence (AI) has the potential to transform public services, with productivity gains identified as a game-changer within government. By 2023, 70 percent of government agencies were testing or planning AI projects, showing a clear move towards using these new technologies.
“There is now a much stronger drive in the government to adopt these new technologies to improve processes, drive efficiencies and increase productivity,” said James Poulten, lead data scientist at Made Tech.
But with AI dominating discussion among both business and government leaders, it can be easy for organisations to get caught up in the hype and shift their focus from the user needs driving its adoption.
“It’s become my job to say, ‘Love the enthusiasm. Let’s take a moment to evaluate what you actually need,’” said Poulten. “There will often be a data science tool that we can implement to help you with your unique business case. Sometimes you might need a generative AI model. But in a lot of cases, more traditional data science techniques can really make an impact.”
Similarly, while government organisations want to see the benefits of AI, they must ensure the groundwork is prepared in terms of both their data and their infrastructure.
“The reality is there’s a lot of work that needs to go into building these technologies, even in a data-mature organisation,” said Poulten. “It’s not a case of click a button and suddenly you have a fully integrated machine learning pipeline. It takes a lot of data preparation, many training iterations, and lots of thinking about security. You need to make sure that there are people still in the decision chain so that it doesn’t just become an automated overlord system.
“We find a lot of organisations aren’t quite at the stage where they can really harness this technology,” he added. “Their data maturity is low. They could still be working on siloed data sources or with Excel sheets. They’ll probably need a bit of uplift or a piece of work around how they interact with data in general before they can really adopt these technologies.”
Before you go all in on AI
In terms of advice for organisations looking to embark on their AI implementation journey, Poulten believes that the most important step on any journey is the next one.
“The most important thing you can do is just make the decision to change. To say, ‘Okay, we can’t keep working like this. It’s working for now, but we need to start thinking about the long-term strategy.’”
Another key piece of advice is to think beyond the shiny new predictive model that has just been built for you, and make sure the backend and underlying infrastructure can support your long-term strategy. A strong team of data engineers and data architects can build a lasting environment that helps your organisation quickly adopt new technologies as they come along.
“Often on data projects, clients will overlook other disciplines. You end up with data ninjas or data rockstars. But it’s all about building a multi-disciplined team. I’d never advise someone to work on the principle of ‘one person can do it all,’” said Poulten. “A data scientist doesn’t have the same expertise as a systems designer or a user researcher or an interaction designer. The data scientist doesn’t have the same expertise as a data engineer when it comes to robust architecture, data pipelines and the automation of AI. When it comes to delivering a successful data or AI project, a team with a wide spread of skills is crucial.”
AI success stories in the public sector
That said, there are examples of the public sector successfully putting AI to work. One approach, said Poulten, is retrieval-augmented generation (RAG).
“Put simply, RAG allows users to use chatbots to quickly query hundreds, if not thousands, of pages of documentation,” he explained. “You can ask it a question that a human would ask, and it goes through the documents, finds the relevant points, and returns it to the person doing the search. RAG is looking to be one of the few truly meaningful use-cases for LLMs.”
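To make that pattern concrete, here is a minimal sketch of the retrieval step Poulten describes: document chunks are embedded, the passages closest to the question are retrieved, and only those are handed to the language model. The embedding model, example chunks and prompt below are illustrative assumptions, not Made Tech’s implementation.

```python
# Minimal RAG sketch: embed document chunks, retrieve the passages most
# similar to the user's question, and pass only those to the language model.
# Model name, chunks and prompt are illustrative stand-ins.
import numpy as np
from sentence_transformers import SentenceTransformer  # assumed embedding library

model = SentenceTransformer("all-MiniLM-L6-v2")

# In practice these would be thousands of chunks split from long documents.
chunks = [
    "Annex B sets out the eligibility criteria for the grant scheme.",
    "Applications must be submitted within 28 days of the notice.",
    "The appeals process is described in section 4 of the guidance.",
]
chunk_vectors = model.encode(chunks, normalize_embeddings=True)

def retrieve(question: str, top_k: int = 2) -> list[str]:
    """Return the chunks whose embeddings are closest to the question."""
    q = model.encode([question], normalize_embeddings=True)[0]
    scores = chunk_vectors @ q  # cosine similarity (vectors are normalised)
    best = np.argsort(scores)[::-1][:top_k]
    return [chunks[i] for i in best]

question = "How long do I have to apply?"
context = "\n".join(retrieve(question))

# The retrieved context plus the question would then be sent to an LLM,
# which answers using only the supplied passages.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)
```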
Elsewhere, Poulten and his team have worked with Skills for Care, the workforce development and planning body for adult social care in England, which needs to supply data on the number of workers in adult care homes across the country.
“The process of collecting all that data and producing that number was taking Skills for Care well over a month of work each time they were asked for it,” said Poulten. “They had a team of three or four analysts ingesting the data, deduplicating it, dealing with the data transforms that were required, in order to eventually predict the number of workers. They would then feed this back to government ministers and the Cabinet Office.
“We worked with them to automate the majority of the process and developed a number of AI models that utilised more of the data they were collecting to generate predictions. It was a learning process for them, and their analysts really developed a lot, going from no code to data engineers in a matter of months.”
Made Tech turned a task that previously took the team more than a month into a pipeline that ingests new data every two weeks and runs in 20 minutes.
“Suddenly, you have a massive gain in efficiency and freed up a whole team of people to go and do the rest of their job,” said Poulten.
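The shape of that kind of pipeline, following the steps described above (ingest, deduplicate, transform, predict), can be sketched roughly as follows. The file layout, column names and the simple regression model are hypothetical stand-ins, not Skills for Care’s actual code.

```python
# Rough sketch of an automated ingest -> deduplicate -> transform -> predict
# pipeline. Paths, columns and the model are hypothetical illustrations.
import pandas as pd
from sklearn.linear_model import LinearRegression

def run_pipeline(csv_paths: list[str]) -> float:
    # Ingest: load each fortnightly submission and stack them together.
    data = pd.concat([pd.read_csv(p) for p in csv_paths], ignore_index=True)

    # Deduplicate: keep the latest record per care provider.
    data = (data.sort_values("submission_date")
                .drop_duplicates(subset="provider_id", keep="last"))

    # Transform: derive the features the model expects.
    data["beds_per_site"] = data["registered_beds"] / data["sites"]
    features = data[["registered_beds", "sites", "beds_per_site"]].fillna(0)

    # Predict: estimate headcount for providers that did not report one,
    # training on those that did.
    reported = data["reported_workers"].notna()
    model = LinearRegression().fit(features[reported],
                                   data.loc[reported, "reported_workers"])
    estimated = model.predict(features[~reported])

    # National figure = reported workers plus estimates for the rest.
    return data.loc[reported, "reported_workers"].sum() + estimated.sum()
```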
Made Tech also helped the Met Office interrogate hundreds of thousands of individual pieces of user feedback from millions of daily users of the Met Office public website and mobile apps. By automating the sentiment analysis of vast amounts of user feedback, the Met Office could extract patterns and trends with an efficiency that was previously impossible. Combined with traditional methods of user research, this gave the Met Office in-depth, qualitative insights into user behaviours.
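As a rough illustration of what automating sentiment analysis over free-text feedback can look like, the sketch below classifies comments with an off-the-shelf model and aggregates the labels into a trend; the model choice, example comments and aggregation are assumptions, not the Met Office’s actual setup.

```python
# Minimal sketch of automated sentiment analysis over free-text feedback.
# The model and aggregation are illustrative, not the Met Office's pipeline.
from collections import Counter
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # default English sentiment model

feedback = [
    "The rain radar is brilliant, I check it every morning.",
    "App keeps crashing when I add a second location.",
    "Hourly forecast was spot on for the weekend.",
]

results = classifier(feedback)
trend = Counter(r["label"] for r in results)

# Aggregated labels surface patterns across thousands of comments,
# which user researchers can then explore qualitatively.
print(trend)  # e.g. Counter({'POSITIVE': 2, 'NEGATIVE': 1})
```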