For much of the past decade, AI in the public sector has lived at the margins: pilots, proofs of concept, innovation labs and controlled trials. In 2026, that phase is ending. AI is moving into the operational core of government, supporting how the state functions day to day.

“In 2026, the UK public sector will move decisively from AI experimentation to AI-powered transformation,” said Prashant Kale, CTO of Scrumconnect. “Departments will start embedding data and machine intelligence into the plumbing of core services.”
That shift will be felt most acutely in frontline environments. Kale points to healthcare, border control and policing, where AI supports diagnostics, screening and resourcing decisions. But with scale comes scrutiny.
Uneven adoption, inevitable direction
For Dwayne Johnson, chief officer for local government at ICS.AI, adoption is unavoidable, even if maturity varies widely. “By the end of 2026, I envision 100 percent of councils will be using AI in some form,” he said. “The shift will be uneven, but its direction is inevitable.
“We’re also likely to see between 30 and 50 percent of local authorities and public sector organisations having moved beyond lightweight tools and adopted some form of agentic AI.”
The unevenness, Johnson argues, will reflect culture and confidence more than capability. “Councils that prepare early – both culturally and organisationally – will be the ones who unlock its full potential for their communities,” he says.
That preparation goes well beyond technology. Johnson points to a deep equality gap in how citizens experience AI. “Many residents understand what Alexa does, or how personalised systems like Netflix work, yet many have never had access to these AI tools themselves,” he says. “That divide means that when councils talk about AI-enabled services, they often speak from a completely different starting point than the people they serve.”
From invoice processing and procurement to citizen queries, AI is increasingly seen as a “work companion”. But AI should augment professional judgement, not replace it.
“The aim here is not to replace professional judgment,” added Carrie Ramskill, COO at HGS UK, describing AI tools used in Continuing Healthcare, “but to give teams more time to focus on the people behind the assessments.”
Trust, scepticism and the cost of overpromising
If scale is inevitable, trust is not. Johnson notes the damage caused by inflated expectations. “The biggest barrier to adoption will not be the technology itself, but trust,” he says. “Years of over-marketed products have left many officers wary of shiny, overpromised AI solutions that deliver very little substance.”
That scepticism, he argues, is rational. “Too many AI promises have turned out to be ‘fur coat, no knickers’. Great on the surface, nothing substantial underneath.”
For Scott Bridgen, GM of risk and audit at Diligent, this makes governance non-negotiable. “When these systems advise citizens on legal rights, benefits or taxes, the stakes are incredibly high,” he says. “There should always be a human in the loop to validate decisions being made by AI, especially when it can impact people’s livelihoods or legal standing.”
Bridgen also cautions against inappropriate deployment. “While governments may see benefits to providing services through consumer-grade messaging platforms… purpose-built platforms designed with compliance, data sovereignty and sector-specific security standards are far better suited.”
Skills and culture as the bottleneck
Almost every industry watcher we spoke with identifies skills – not software – as the limiting factor. Kale expects AI capability to become formalised. “Expect mandatory AI training, clearer digital career paths, and targeted hiring,” he says, so teams can “use these tools well, question them confidently, and keep improving them.”
Mark Gibbison, AVP global public sector at Unit4, agrees, arguing that AI literacy must extend beyond technical teams. “There will definitely need to be a focus on AI literacy for non-technical leaders in the public sector,” he says, noting that leadership comfort with AI directly influences adoption and trust.
Johnson emphasises staff support during transition. “Successful implementation depends on supporting staff through the transition, especially when capacity is stretched,” he says. Expect scepticism to remain – and to be healthy.