Can AI make social care human again? What technology can genuinely do for the people at the heart of care
Outstanding care does not begin and end at the hospital door. It is delivered by the 1.5 million people working across adult social care in England, in care homes, in people's own homes, in supported living settings.
The people they support often have complex, long-term needs. Getting digital transformation right in social care matters enormously, for residents, for families, and for the wider health and care system that depends on it.
Being included in the fellowship was a recognition of that. Walking into that first cohort session, clinicians, researchers, allied health professionals, and me, representing social care, felt both humbling and right. The conversation about what technology can do for people in care settings is stronger for having social care in the room. I believe that, and the fellowship reinforced it.
The tools that changed how I think
I came into the fellowship having spent many years leading digital transformation in care, implementing systems and building digital capability across a group of homes.
I had hands-on experience. What the fellowship gave me was a framework to build on those instincts. One session that has genuinely changed how I work was on storytelling in projects, run by the team at TPXimpact. The insight at its heart is simple: facts alone rarely move people.
When you are trying to create change, whether that is improving care outcomes or shifting how a team thinks about technology, you need people to feel something before they will do something differently.
That landed directly in my fellowship project on falls prevention. Rather than producing a report, I worked with key stakeholders to create a series of short videos in which residents, family members, care colleagues, and clinicians share their experiences and perspectives in their own words.
The aim was to build awareness and empathy around a challenge that affects thousands of older people every year but often gets reduced to an incident statistic.
You can watch them at Topol Fellowship Videos - Peverel Court Care. People who watched them, including experienced care professionals, told me they saw the problem differently afterwards. That is what good storytelling does.
The stakeholder mapping work pushed me to be disciplined about who needed to be involved, who needed to be informed, and who needed to be genuinely central to any change, which in care is always the resident or person being supported. The reframing session was the one that shifted my perspective: taking a problem that presents as one thing and mapping the real drivers beneath it.
Falls are not just about equipment or physical environment. They are about knowledge, training, confidence, relationships, and design. Reframe the problem properly, and the solution space opens up entirely.
I carry all three methodologies with me now as the way I approach any significant change, including the introduction of AI.
The connections that stay with you
One of the things nobody tells you about a fellowship is how much the people matter. I made connections across the cohort that I genuinely value, professionals from very different parts of the health and care landscape wrestling with similar questions from different angles.
One friendship that has endured is with Rav Sekhon, Operations Director at Balance Care, a fellow from the north of England. We have stayed in touch after the fellowship ended.
Earlier this year I travelled to HETT, where AI in health and care was front and centre on the agenda. I stayed with Rav and his family, and we attended the event together.
That kind of connection, built through shared learning and sustained by genuine friendship, is something the fellowship makes possible in a way that an online CPD module simply cannot.
Where the thinking has led: AI and the opportunity for social care
The fellowship gave me better tools for thinking about digital change. Now I am applying them to what I believe is a defining question for adult social care right now: how do we adopt AI in a way that genuinely serves the people at the centre of it?
My view, formed through practice, is that AI adoption in social care is not primarily a technology challenge. It is a people challenge, one of culture, trust, and relationships.
The principle I keep returning to is this: AI surfaces the signal. The human makes the decision. Always.
Three areas stand out as genuinely transformative, not in theory, but in what is already becoming possible.
Care plans that stay alive and personal
One of the persistent challenges of care planning is drift. In any busy setting, care plan language can become generic over time, task-focused, deficit-led, describing what a person cannot do rather than who they are.
AI that continuously audits care plans for this kind of quality erosion, flagging where a resident's individual voice has faded and prompting teams to restore it, is not replacing professional judgement. It is acting as a conscience, catching the slow erosion of person-centredness that happens when teams are stretched.
A care plan should be a living document. AI can help keep it that way.
Rotas that work for people, not just the spreadsheet
Scheduling in care is complex and time-consuming: contracted hours, skill mix, working time rules, individual preferences, and absence history all have to be held simultaneously. AI-assisted scheduling can analyse all of those variables at once and surface a recommendation that is transparent at every step, showing its reasoning rather than hiding it.
The manager always makes the final call. But they make it faster, better informed, and with less of the administrative load that currently eats into time that could be spent on care.
Recruitment that finds the right people
Care faces a well-documented workforce challenge. AI tools that support, rather than replace, recruitment decision-making can help teams identify candidates more likely to stay, to thrive, and to fit the culture of a home.
The distinction is important: AI that surfaces the best candidates for a human to assess is fundamentally different from AI that makes the hiring decision. In a sector built entirely on relationships, that boundary is not negotiable.
What connects all three is the same idea. AI earns its place in care when it removes the administrative friction standing between a carer and the moment that actually matters, not when it replaces human judgement, but when it gives time, clarity, and confidence back to the people doing the work.
The bigger picture
Social care has a real opportunity to adopt AI on its own terms, with the governance, ethics, and human values of the sector built in from the start, rather than inherited from systems designed for somewhere else.
The fellowship helped me understand how to approach that: through storytelling, through genuine stakeholder engagement, through reframing the problem before reaching for a solution.
Those methodologies came from a falls prevention project. They apply just as directly to the responsible adoption of AI.
The question I now ask of any AI tool is the same one TPXimpact taught me to ask at the beginning of any project: who is this actually for, and what do we want them to think, feel, and do differently as a result?
Keep that question at the centre, and AI will not make social care less human. It will make it more so.
If AI in health and social care is something you are thinking about, or excited by, follow me on LinkedIn and let's talk.
Page last reviewed: 5 May 2026
Next review due: 5 May 2028