AI is showing great promise as a tool to support the future of social care, offering an extraordinary opportunity to substantially improve the quality of care for the nation’s elderly. In March, the University of Oxford hosted the AI in Social Care Summit, where experts examined how AI technologies are transforming the monitoring and care of vulnerable populations. Dr. Caroline Green, director of research at the Institute for Ethics in AI, emphasized the potential benefits of AI while stressing that it cannot replace the human touch that is essential to caregiving.
The summit brought together notable figures, including Thomas Tredinnick, head of AllyCares, which employs sensors to monitor care home residents overnight. Tredinnick’s technology listens for atypical sounds or movements, giving caregivers important information to act on. AllyCares reports that its monitoring system reduced emergencies by 81% this year compared with last.
Dr. Green likened the AI technology to a “baby monitor”, stressing that the aim is to complement, not replace, human caregivers. “AI can only be part of the solution but not the whole solution,” she said. This sentiment echoed throughout the summit, with a collective understanding that while AI can aid in care provision, it should not supplant human interaction.
The UK currently has about 12 million people aged 67 or over, a figure projected to rise to 13.7 million by 2032. This growing population creates an urgent need for innovation. While Dr. Green welcomed the use of AI, she warned against expecting it to be a cure-all for the ongoing issues facing social care. “At the moment there’s no official government policy or guidance on the use of AI in social care,” she noted, underscoring the need for regulatory frameworks to guide the integration of technology in caregiving.
Aislinn Mullee, deputy manager at a care home where AI technology has been deployed, shared her experiences with these innovations. She noted that detecting pain in non-verbal residents can be especially difficult. A smartphone app called PainChek has changed this by automatically assessing pain levels with much higher accuracy. Mullee affirmed that “the technology has made a huge difference” in assessing residents’ needs and in collaborating with local GPs to evaluate pain medication.
Despite these advancements, Mullee cautioned against over-relying on technology. “It can help with some of the administrative work, some of the operation of care, but it cannot replace that human touch,” she said. Her viewpoint reflects ongoing debates about the ethics and impacts of AI in social work more generally.
Christine Herbert’s 99-year-old mother, Betty, is one resident now monitored by AI. Herbert’s initial apprehensions about her mother’s welfare led her to request routine checks during Betty’s early days at the care home. Her experience echoes that of many families grappling with the challenges of bringing technology into caregiving spaces.
Amid the rapid adoption of AI, innovators like Dr. Marco Pontin are improving patient care in new ways. Pontin helped create a new kind of robot that responds to human touch, bridging the gap between emerging technology and holistic caregiving practices. Even as these technologies emerge, however, experts caution that they cannot replace the human touch.
Lee-Ann Fenge, former chair of the social care roundtable and a leading voice in social care’s digital transformation, warned against adopting a “magic bullet” approach to AI. “There is a potential risk that AI is going to be seen as a panacea to some of those very big problems that we are seeing in social care provision,” she stated. As much as AI can improve operations, Fenge stressed, it should never eclipse the need for human professionals.
A spokesperson from the Department of Health and Social Care echoed similar sentiments, stating, “Making better use of AI in social care is exactly the kind of transformation we’re championing in our 10 Year Health Plan.” The plan aims to shift from reactive sickness treatment toward proactive, community-based care, supported by digital health solutions.
Dr. Green makes the case for a cautious approach to introducing AI across social care systems, raising alarms over the possibility of diminished human judgment in care environments. “Whether it’s going to be good for people who need support depends on how policy is going to shape up here,” she said, stressing the need for careful consideration of the options available to individuals regarding AI’s role in their care.