Less than half (40 percent) of the UK public trust the public sector to use AI responsibly, according to new findings from Nesta’s Centre for Collective Intelligence (CCI).

The research forms part of CCI’s work to assess the “social readiness” of AI tools used in public services. Opinium surveyed more than 2,000 adults on their views of AI in areas including health, transport, education and social care.
Just over two in five respondents (41 percent) believe AI is dangerous and should not be used in the public sector, compared with 29 percent who think it should be widely adopted. People were most comfortable with AI being used in the NHS (38 percent), transport (37 percent) and education (36 percent). Support was lowest for policing (28 percent), defence (28 percent) and social care (29 percent).
When asked what should matter more when deciding whether to deploy AI, respondents favoured public support (46 percent) over saving money (18 percent). More than half (52 percent) felt the public should be consulted before introducing AI in public services – only one in five said the decision should be left to experts.
Labour voters were most likely to prioritise public support (56 percent), while 28 percent of Conservative voters preferred cost savings.
Testing trust in social care
CCI also tested its AI Social Readiness Advisory Label through workshops on Magic Notes, a UK-developed AI tool that creates transcripts and meeting summaries for social workers.
Initial confidence in social care processes was low – only 13 percent of participants were satisfied with current delivery. But after discussion and review, 74 percent believed the benefits of Magic Notes outweighed the risks. A separate survey found 86 percent of the public and social care users thought the tool could improve services.
Nesta has recommended steps to build further confidence, including consent protocols, staff feedback mechanisms and monitoring the impact on waiting times, user experience and social worker job satisfaction.
‘Support will make or break AI deployment’
Kathy Peach, director of CCI, warned that public buy-in is essential: “The government’s AI Adoption plan is bound to fail unless there’s public support for AI in public services. Our advisory process helps build confidence and trust, especially in areas like social care that have a lot to gain from AI.”
Rachel Astall, chief customer officer at Beam, the developer of Magic Notes, said the consultation had given practical guidance. “It gave us a clear sense of what earns public confidence. We hope more organisations will take similar steps,” she said.
CCI plans to assess further public-sector AI tools in the coming months.