The risks around artificial intelligence (AI) currently dominate news headlines, with attention now turning towards regulation. However, in social care, regulating too pre-emptively and overlooking AI's potential to revolutionise the sector could be extremely damaging, argues Intelligent Lilli CTO Sameer Vartak.

Lilli uses machine learning and remote monitoring sensors to collect data on patient baselines. Its technology identifies when behaviours deviate from the norm, helping to better connect carers with those they care for. The company currently has live pilots with Dorset, Nottingham City, North Tyneside and Reading Councils, helping to enable independent living whilst supporting social care workers.
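Lilli has not published the details of its algorithms, but the approach described here – learning an individual's normal pattern of activity from sensor data and flagging days that deviate from it – can be illustrated with a minimal sketch. The sensor readings, window size, threshold and function names below are hypothetical assumptions for illustration only, not Lilli's implementation.

```python
# Minimal, hypothetical sketch of baseline-and-deviation monitoring.
# Not Lilli's implementation: the data, window size and z-score
# threshold are illustrative assumptions only.
from statistics import mean, stdev

def build_baseline(daily_counts, window=14):
    """Learn a simple baseline (mean and spread) from the most recent
    `window` days of activity counts for one person."""
    recent = daily_counts[-window:]
    return mean(recent), stdev(recent)

def flag_deviation(today_count, baseline_mean, baseline_std, threshold=2.5):
    """Flag the day if today's activity falls well outside the person's norm."""
    if baseline_std == 0:
        return today_count != baseline_mean
    z = (today_count - baseline_mean) / baseline_std
    return abs(z) > threshold

# Example: kitchen-motion events per day for one individual (synthetic data).
history = [42, 39, 45, 41, 38, 44, 40, 43, 39, 41, 42, 40, 44, 38]
mu, sigma = build_baseline(history)
print(flag_deviation(today_count=12, baseline_mean=mu, baseline_std=sigma))  # True: unusually low activity
```

In a deployment like the one described, a flagged deviation would prompt a carer to check in rather than trigger any automated decision.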
“The entire health and social care system is gridlocked and under immense strain. Growing demand and resource constraints have stretched the sector – and its amazing staff – to its limit, which has resulted in workforce shortages and unprecedented waiting list backlogs. It’s essentially left people stuck in the wrong part of the system. In fact, as many as one-in-three English hospital beds are now occupied by patients whose discharge has been delayed due to adult social care pressures,” says Vartak.
“It’s a whole-system challenge that’s risking health outcomes across the board. At Lilli, we are breaking the cycle by using technology to tackle the issues in social care first, recognising that innovation has the power to transform outcomes across the entire health and social care ecosystem.”
In light of current concerns over AI, Vartak says that by “getting too far ahead of ourselves and focusing on regulations and restrictions, we risk prematurely stifling its growth and innovation without reaping any of the rewards. AI has the potential to revolutionise the social care sector and improve countless lives, similar to what we’ve seen with machine learning.”
“Consider AI’s capacity to quickly analyse large-scale data sets, such as patient records or social determinants of health. Much earlier intervention becomes possible, which leads to better resource allocation and optimised care outcomes. This can help us tackle the most pressing challenges in the sector, such as increased demand and workforce burnout, as improved caseload management saves precious care hours and allows staff to make more proactive, data-driven decisions.”
Real-time insights they need to support care decisions
Vartak says Lilli’s remote monitoring technology gathers data on a person’s behaviours within their home. The data provides social care workers with the real-time insights they need to support care decisions, such as the rightsizing of care packages for people being discharged from hospital, or allocating precious resources to where they are needed most.
“Social workers have good instincts about a vulnerable person’s care requirements – but they are challenged daily about the decisions they make, and often lack the concrete evidence to back them up. At Lilli, we support social care workers by putting evidence-based decision-making capabilities firmly in their hands. This ultimately leads to the best outcomes for service users, enabling early intervention and allowing them to live independently for as long as possible.”
The technology also brings indirect, system-wide benefits, from reduced bed blocking to fewer GP appointments and ambulance call-outs, relieving the many pressures facing the NHS.
“This brings significant cost and resource efficiencies at a time of great need – the evidence from early trials shows Lilli can accelerate hospital discharge by 16 days and generate over 9,000 extra care hours across two councils. Importantly, it also improves the quality of life for our most vulnerable and those working across the sector,” says Vartak.
“There’s also a growing recognition that this sector must become more technology and data-driven to meet heightened demands. This is why Lilli is collaborating with leaders in digital care management, including Nourish Care, to help local authorities leave the paper trail behind and successfully implement digital social care records (DSCR) to meet ambitious government targets for 80 percent of adult social care providers to have a DSCR solution in place by next year.”
Ensuring responsible use of AI in social care

Vartak does acknowledge there are risks around AI that must be understood and addressed. “Right now, large language models like GPT are general purpose tools that shouldn’t be relied on for decision-making in care, as they may be trained on biased data or give ‘confidently incorrect’ answers. Care must also be taken to protect personal and sensitive details about individuals. We’re not yet at a stage where a care worker can use generative AI to make decisions on what medication a person should take, for example,” he explains.
“To ensure responsible use of AI in social care, we should implement transparent governance frameworks and privacy protection measures, as well as human oversight. It’s all a learning cycle – and AI innovations need to be approached with an open mind, taking the time to fully understand the opportunities and risks.”





