Practical AI is the Academy’s new event series, launched to share the realities of deploying AI in the systems we all rely on – from transport and housing to energy, utilities, and more. The focus is on the practical decisions that determine whether AI delivers tangible improvements for people and places: how data is gathered, how risk is managed, how trust is built, and how engineers, policymakers, and adopters solve problems together.
The third discussion in the series focused on how in-home sensors are being used to support safer independent living for people with care needs in Kent. Our speakers explored what it takes to embed AI-enabled sensing into people’s homes in a way that earns trust, how care teams turn algorithmic alerts into interventions, and what enabling conditions are needed to scale.
It comes as the Academy works on the future of neighbourhood health for ageing coastal communities, where the challenge of supporting independent living at scale is acute. The discussion is also timely as the government’s 10 Year Health Plan sets out a vision built around three long-term shifts: moving care closer to people’s homes and communities, harnessing digital technology, and reorienting from a reactive model of treatment to a greater focus on prevention.
What the room knows
When a carer walks into someone's home, they perform an almost unconscious audit. Is the kitchen being used? Does the person seem rested? Has anything changed since the last visit? This kind of reading of a living environment is hard to teach and harder still to document. It depends on knowing the individual, accumulating impressions over time and noticing changes that live mostly in the carer's memory.
In Medway, in Kent, a council-owned technology company called Kyndi is trying to give carers a version of that baseline that doesn't depend on memory alone. Working with Circadacare, a startup co-founded by a computational neuroscientist, they have been installing smart sensors in the homes of people with care needs, including those living with dementia. The sensors track movement, temperature, sound and light, AI helps analyse patterns, and carers receive alerts and trend reports that tell them, before they walk through the door, whether anything looks different. Instead of trying to replace clinical judgment, the system is giving clinical judgment better information to work with.
AI will never replace those lovely human faces and the compassion that people have and really benefit from those who look after them. But AI can make the time that carers and health workers spend with people more qualitative, because it gives us the ability to anticipate what’s wrong.
Councillor Teresa Murray, Deputy Leader, Medway Council
The demand problem
Demand for adult social care in England has been rising for years, partly fuelled by an ageing population and the increasing prevalence of complex conditions like dementia. Local authorities face a tension between the cost of residential care and the strong preference of most people and clinicians for supporting independent living for as long as possible.
With the number of people with dementia projected to continue rising in Medway, the Council focuses on prevention: catching changes early, intervening before crises develop and reducing the reactive pressure on acute services.
Kyndi sits at the centre of this strategy. It began as a council department responsible for telecare services before being converted into a Local Authority Trading Company, a model that gives it the commercial flexibility to run proof-of-concept trials, horizon-scan new products and manage its own funding cycle, while remaining accountable to the council. Rob Kennedy, who leads Kyndi's business development, described its role: "We operate almost like a bridge between health and social care, and between councils and the technology market." Health and care workers don't need to track what products exist; Kyndi does that scanning for them.
The core product is a smart bulb that supports the circadian system by mimicking the natural rhythm of light across the day: very bright in the morning, dimming gradually through the afternoon, settling into a warm glow in the evening.
This is a well-grounded application of what is known about how light affects sleep, mood and cognitive function, particularly in people with dementia. The design logic is that disrupted sleep is both a major symptom and an accelerant of dementia. Restoring something closer to natural light cycles can help stabilise it. The bulb contains a network of sensors that monitor sound, temperature, humidity, light and motion throughout the property.
What the AI does
The system collects continuous data from the home environment. AI's job is to establish what is normal for a specific individual and flag when something departs from that.
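The "learn what is normal for this individual, flag departures" idea can be sketched in a few lines. This is purely illustrative: the function name, the three-standard-deviation threshold and the example data are assumptions for the sketch, not how Circadacare's models actually work.

```python
from statistics import mean, stdev

def is_anomalous(history, value, threshold=3.0):
    """Flag a reading that departs from this individual's own baseline.

    history: past readings for the same person and time-of-day slot.
    A reading is anomalous if it lies more than `threshold` standard
    deviations from the personal mean (a hypothetical rule of thumb).
    """
    if len(history) < 2:
        return False  # not enough data yet to define "normal"
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return value != mu
    return abs(value - mu) / sigma > threshold

# A person who regularly gets up at night: 2-3 kitchen visits is their normal
night_kitchen_visits = [2, 3, 2, 3, 2, 3, 2]     # visits per night, past week
print(is_anomalous(night_kitchen_visits, 3))      # within baseline -> False
print(is_anomalous(night_kitchen_visits, 12))     # sharp departure -> True
```

The key design point is that the comparison is against the individual's own history, not a population norm, so behaviour that would look unusual in one home can be entirely normal in another.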
Circadacare smart bulbs
The longer they can live well, the better they will enjoy that later life, and assistive technology is absolutely, for us, part of that picture
Councillor Teresa Murray, Deputy Leader, Medway Council
Tallie gave an example of how this works for acoustic monitoring. The sensors can classify sounds including household activities, coughing, snoring and sounds of distress. If someone has a chronic cough, the system learns this and does not alert to it. But if their coughing pattern changes, for example becomes more frequent at night, that change triggers an alert.
As another example, one user began making sounds of distress late at night, and the system raised an alert. The carer who attended found the woman leaning out of bed, trying to reach a television remote she had dropped. Without the intervention, she was on a trajectory towards a fall. With it, the situation was resolved without harm.
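The cough example can be sketched as a simple frequency rule: a chronic cough becomes part of the baseline, and only a marked shift in frequency raises an alert. The function name, the doubling ratio and the minimum-event floor below are hypothetical choices for illustration, not the deployed logic.

```python
def cough_alert(nightly_counts, recent_count, ratio=2.0, min_events=5):
    """Alert only when coughing departs from this person's own pattern.

    nightly_counts: cough events per night over recent history.
    A chronic cough yields a high baseline, so steady coughing is
    suppressed; a sudden jump (e.g. a doubling) stands out.
    """
    baseline = sum(nightly_counts) / len(nightly_counts)
    return recent_count >= min_events and recent_count > ratio * baseline

chronic = [8, 9, 7, 8, 10]        # a known chronic cough: part of the baseline
print(cough_alert(chronic, 9))     # consistent with baseline -> False
print(cough_alert(chronic, 25))    # frequency has roughly tripled -> True
```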
Most of the AI currently deployed is what Tallie describes as classification models, which are comparatively interpretable by design. As the models become more sophisticated, the question of explainability becomes more pressing. Care teams need to understand what the model flagged and also why. That is both a technical and a governance challenge the company is working through as it develops its next generation of tools.
One next step is to use the accumulated behavioural data to track the progression of dementia between clinical reviews. Memory clinic appointments in England are typically every six or twelve months. A lot can change in that interval, and much of it goes unrecorded. Circadacare's ambition is to make that interval visible.
The human system around the algorithm
The technology is only one part of what Kyndi deploys. Every installation begins with a person-centred assessment. Rhiannon Price, who manages Kyndi's care technologist team, describes a process in which the kit is shown to the person, explained to them, and, where possible, to their family. The goal is to ensure the person understands what is being measured, who sees it, and what happens with the information.
Friction can emerge here, because people fear that monitoring signals a loss of autonomy rather than its extension. “People tend to think that it’s taking their independence away from them when it absolutely isn’t. What we’re here to do is to try and promote that independence so that they live a better life in their own home,” explained Rhiannon.
There are also practical complications unrelated to attitudes. Rob noted that not everyone they visit has a home internet connection; when that's the case, Kyndi provides routers. Another issue arises when homes have non-standard light fittings that the Circadacare bulbs can't accommodate. Through trial and error, small desktop lamp stands became the solution.
Managing the alert system
The risk with monitoring systems is alert fatigue. Too many notifications and carers stop attending to them; too few and the system misses things that matter. Getting this right has been, according to our speakers, an ongoing process.
Tallie described a deliberate decision to err on the side of false positives, that is, to alert rather than miss. The team has worked closely with care teams to understand what level of alerting is tolerable, at what times of day, and in what format. Some alerts are immediate, others arrive as morning summaries of trend data rather than incidents.
Rob illustrated the system's value with an example: in one home, the heating dropped during the day and rose at night. Initially this might generate alerts. Over time, it is recognised as the normal pattern for that household. But if the pattern suddenly changes and the property stays cold all day and all night, that change stands out clearly against the established baseline. "AI surfaces it, and a human then decides what it means," explained Rob.
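Rob's heating example can be sketched as a per-household, hour-of-day baseline. Everything below (function names, the four-degree margin, the sample readings) is an illustrative assumption, not the product's actual logic.

```python
from collections import defaultdict

def hourly_baseline(readings):
    """Learn a household's temperature profile by hour of day.

    readings: (hour, temp_c) pairs gathered over weeks, so a home that
    runs cool by day and warm by night is learned as normal, not flagged.
    """
    buckets = defaultdict(list)
    for hour, temp in readings:
        buckets[hour].append(temp)
    return {h: sum(v) / len(v) for h, v in buckets.items()}

def cold_property_alert(baseline, day, margin=4.0):
    """Alert only if the whole day sits well below this household's norm."""
    return all(temp < baseline[h] - margin for h, temp in day)

# Illustrative household: cool by day (15C), warm by night (21C)
history = [(h, 15.0) for h in range(8, 20)] + [(h, 21.0) for h in range(0, 8)]
base = hourly_baseline(history)
normal_day = [(10, 15.2), (14, 14.8), (2, 20.5)]
cold_day = [(10, 9.0), (14, 8.5), (2, 10.0)]
print(cold_property_alert(base, normal_day))  # matches the baseline -> False
print(cold_property_alert(base, cold_day))    # cold day and night -> True
```

As in Rob's description, the sketch only surfaces the departure; deciding what a cold property means, and what to do about it, remains a human judgement.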
The interoperability gap
The system's value is constrained by how well the information it generates connects to the rest of the care and health ecosystem. Here, Kyndi and Circadacare have encountered a familiar obstacle: interoperability, or rather its absence.
Tallie described efforts from the outset to build the system so that it can both receive and send data across different platforms. In practice, finding willing and capable partners for this kind of data exchange has been difficult. The challenge is multiplied when it extends into NHS systems, where different trusts operate different infrastructure and integration timescales are long. The interoperability problem, Tallie said, “has been raised by people working for councils and all of the surrounding ecosystem for a long time. It’s been a very difficult nut to crack.”
The ambition is to build a unified picture of an individual, combining data from the sensor network with medication adherence, mobility patterns and carer observations, that could flag risk well before a crisis. "Hitting it right up at the preventative end is where you want to be," explained Rob. Whether it can be achieved at scale depends partly on technical solutions that are still being built, and partly on system-level decisions about data sharing that go beyond any individual product.
What regulators are missing
Councillor Murray noted that the Care Quality Commission, which inspects social care providers in England, currently makes no assessment of how providers use assistive technology, which is a significant gap. If the regulator that shapes the standards and incentives of the sector doesn’t treat technology adoption as a quality indicator, providers have less reason to prioritise it. There is an opportunity for regulators to be more demanding.
A similar dynamic applies in planning policy. Medway is expecting to build 29,000 new homes by 2040. Councillor Murray is pushing for a requirement that a proportion of new housing be built with the infrastructure for assistive technology from the outset rather than retrofitted. The decisions being made now about how new homes are built will shape what is possible in care settings for decades.
What this tells us about practical AI
In this case, innovation is embedded in a set of human processes that give it meaning. For example, the individual calibration that allows AI to distinguish a significant change from background noise depends on a thorough initial assessment. The alerts that prompt action are only useful if care teams understand what to do with them, and trust that the system delivers reliable information. That trust is built through careful, iterative relationship management that our speakers described.
As Councillor Murray put it: “We are not trying to diagnose or advise [...]. We are literally just providing digestible information that then the experts know what to do with.”
Issues such as interoperability, regulatory incentives, housing design standards and the long-term accumulation of behavioural data require action at a level above any individual product. Commissioners, planners, regulators and technology developers need to treat these questions as joint problems. In Medway, that conversation is already underway.
Takeaways
The Medway experience offers the following lessons for deploying AI in home care and wider service contexts:
- Person-centred assessment is a prerequisite. The value of AI alerts depends on a baseline built through knowing the individual. Deployed without that foundation, the technology generates noise rather than insight.
- Start with easily interpretable models. Classification-based approaches allow care teams to interrogate what they are being shown and build trust. Once that trust is built, more sophisticated AI can follow.
- Build in partnership with users. The threshold, timing and format of notifications should be developed through ongoing dialogue with the care teams who will act on them.
- Deployment teams need the mandate and flexibility to overcome practical constraints, such as providing routers where homes lack internet or finding workarounds for non-standard light fittings.
- Interoperability requires system-level commitment. In the case of this technology, realising its full potential requires NHS and social care systems to treat data sharing as a priority.
- Regulators can help drive adoption. Regulators who treat technology adoption as a quality criterion can change the incentive structure for providers.
- New homes should be built for assistive technology from the outset. Planning policy that requires new housing to be technology-ready would reduce the long-term cost of supporting independent living at scale.