The future of elder care is here – and it’s artificial intelligence

Kellye Franklin recalls the devastation when her now 81-year-old father, a loyal air force veteran, tried to make his own breakfast one morning: seven open boxes of cereal on the living room floor, milk poured directly into each of them. He would later be diagnosed with moderate to severe dementia.

The elder care assistive robots market is anticipated to grow at a compound annual growth rate (CAGR) of 12.2% over the forecast period, from a value of USD 2.2 billion in 2022 to an estimated USD 7.1 billion by 2032. – Future Market Insights

Yet Franklin, 39, who is her dad’s only child and his primary caregiver, does not worry about that repeating now.

In late 2019, she had motion sensors connected to an artificial intelligence (AI) system installed in the two-floor townhome she and her dad share in Inglewood, in Los Angeles county. Sensors at the top of doors and in some rooms monitor movements and learn the pair’s daily activity patterns, sending warning alerts to Franklin’s phone if her dad’s behavior deviates from the norm – for instance, if he goes outside and doesn’t return quickly.

“I would have gotten an alert as soon as he went to the kitchen that morning,” she says, because it would have been out of the ordinary for her dad to be in the kitchen at all, especially that early. Franklin says the system helps her “sanity”, taking a little weight off an around-the-clock job.

Welcome to caregiving in the 2020s: in rich societies, computers are guiding decisions about elder care, driven by a shortage of caregivers, an ageing population and families wanting their seniors to stay in their own homes longer. A plethora of so-called “age tech” companies has sprung up over the last few years, many promising to keep tabs on older adults, particularly those with cognitive decline. Their solutions are now beginning to permeate home care, assisted living and nursing facilities.

The technology can free up human caregivers so they can be “as efficient as potentially possible”, sums up Majd Alwan, the executive director of the Center for Aging Services Technologies at LeadingAge, an organization representing non-profit ageing services providers.

But while there are potential benefits of the technology in terms of safety for older people and a reprieve for caregivers, some also worry about its potential harms. They raise questions around the accuracy of the systems, as well as about privacy, consent and the kind of world we want for our elders. “We’re introducing these products based on this enthusiasm that they’re better than what we have – and I think that’s an assumption,” says Alisa Grigorovich, a gerontologist who has also been studying the technology at the KITE-Toronto Rehabilitation Institute, University Health Network, Canada.

Technology to help keep seniors safe has been in use for a long time – think life alert pendants and the so-called “nanny cams” set up by families fearful their loved ones could be mistreated. But incorporating systems that use data to make decisions – what we now call AI – is new. Increasingly cheap sensors collect many terabytes of data, which are then analyzed by computer scripts known as algorithms to infer actions or patterns in activities of daily living and detect when things might be off.

A fall, “wandering behavior”, or a change in the number or duration of bathroom visits that might signal a health condition such as a urinary tract infection or dehydration are just some of the things that trigger alerts to carers. The systems use everything from motion sensors to cameras to even lidar, a type of laser scanning used by self-driving cars, to monitor spaces. Others monitor individuals using wearables.
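To make the idea concrete: at their core, these pattern-learning systems compare current activity against a baseline learned for a particular household. The sketch below is purely illustrative and is not taken from any of the products mentioned; it assumes a hypothetical log of timestamped motion-sensor events and flags room-and-hour combinations where today’s activity deviates sharply from what has been seen before.

```python
# Toy illustration of sensor-based anomaly detection (not any vendor's actual code).
# Assumes a hypothetical list of (timestamp, room) motion events for one household.
from collections import defaultdict
from datetime import datetime
from statistics import mean, stdev

def build_baseline(events):
    """Learn average motion counts per (room, hour-of-day) from historical events."""
    counts = defaultdict(lambda: defaultdict(int))  # (room, hour) -> {date: count}
    for ts, room in events:
        counts[(room, ts.hour)][ts.date()] += 1
    baseline = {}
    for key, per_day in counts.items():
        values = list(per_day.values())
        baseline[key] = (mean(values), stdev(values) if len(values) > 1 else 0.0)
    return baseline

def flag_anomalies(todays_events, baseline, threshold=3.0):
    """Flag (room, hour) buckets whose activity is far outside the learned baseline."""
    todays = defaultdict(int)
    for ts, room in todays_events:
        todays[(room, ts.hour)] += 1
    alerts = []
    for key, count in todays.items():
        avg, sd = baseline.get(key, (0.0, 0.0))
        if sd == 0.0:
            if avg == 0.0 and count > 0:
                alerts.append((key, count, "activity where none was expected"))
        elif abs(count - avg) > threshold * sd:
            alerts.append((key, count, f"expected ~{avg:.1f} events"))
    return alerts

# Example: kitchen activity at 5 a.m. in a home where that never happens is flagged.
history = [(datetime(2021, 5, d, 8), "kitchen") for d in range(1, 29)]
today = [(datetime(2021, 6, 1, 5), "kitchen")]
print(flag_anomalies(today, build_baseline(history)))
```

Commercial systems rely on far richer data and models, but the underlying logic – learn a routine, alert on deviation – is the same one that would have flagged an unexpected early-morning trip to the kitchen.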

CarePredict, a watch-like device worn on the dominant arm, can track the specific activity that a person is likely to be engaged in by considering the patterns in their gestures, among other data. If repetitive eating motions aren’t detected as expected, a carer is alerted. If the system identifies someone as being in the bathroom and it detects a sitting posture, it can be inferred that the person “is using the toilet”, notes one of its patents.
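Again purely as an illustration (and not CarePredict’s actual logic), the missed-meal alert described above can be thought of as a simple rule applied to gesture labels that a wearable’s model has already inferred; the gesture stream, time window and threshold here are all hypothetical.

```python
# Illustrative only: a missed-meal check over gesture labels inferred by a wearable.
# Assumes a hypothetical stream of (timestamp, gesture_label) pairs.
from datetime import datetime, time

def missed_meal(gesture_events, window_start, window_end, min_eating_events=5):
    """Return True if too few eating-like gestures were seen inside a mealtime window."""
    eating = [
        ts for ts, label in gesture_events
        if label == "eating" and window_start <= ts.time() <= window_end
    ]
    return len(eating) < min_eating_events

# Example: only two eating gestures detected between 7:00 and 9:00 -> alert a carer.
events = [(datetime(2021, 6, 1, 7, 30), "eating"), (datetime(2021, 6, 1, 7, 31), "eating")]
if missed_meal(events, time(7, 0), time(9, 0)):
    print("Alert: expected mealtime activity not detected")
```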

The system in use in the Franklins’ home is called People Power Family. An addition to it, targeted at care agencies, includes daily reports tracking when someone fell asleep, whether they bathed, and bathroom visits. “You can manage more clients with fewer caregivers,” says the promotional video.

The large blue warning signs on the third-floor dementia care unit of the Trousdale – a private-pay senior living community in Silicon Valley where a studio starts at about $7,000 per month – read “Video recording for fall detection and prevention”.

In late 2019, AI-based fall detection technology from a Bay Area startup, SafelyYou, was installed to monitor its 23 apartments (it is turned on in all but one, where the family didn’t consent). A single camera, unobtrusively positioned high on each bedroom wall, continuously monitors the scene.

If the system, which has been trained on SafelyYou’s ever expanding library of falls, detects a fall, staff are alerted. The footage, which is kept only if an event triggers the system, can then be viewed in the Trousdale’s control room by paramedics to help decide whether someone needs to go to hospital – did they hit their head? – and by designated staff to analyze what changes could prevent the person falling again.

“We’ve probably reduced our hospital trips by 80%,” says Sylvia Chu, the facility’s executive director. The system has captured every fall she knows of, though she adds that sometimes it turns out the person is on the ground intentionally, for example to find something that has fallen on the floor. “I don’t want to say it is a false alarm … but it isn’t a fall per se,” she says. And she stresses it is not a problem – often the resident still needs help to get back up and staff are happy to oblige.

“We’re still just scratching the surface” when it comes to accuracy, says George Netscher, SafelyYou’s founder and CEO. Non-falls – which the company refers to as “on-the-ground events” – in fact trigger the system about 40% of the time, he says, citing someone kneeling on the ground to pray as an example. Netscher says that while he wants to get the error rate down, it is better to be safe than sorry.

Companies must also think about bias. AI models are often trained on databases of previous subjects’ behavior, which might not represent all people or situations. Problems with gender and racial biases have been well documented in other AI-based technology such as facial recognition, and they could also exist in these types of systems, says Vicente Ordóñez-Roman, a computer vision expert at the University of Virginia.

That includes cultural biases. CarePredict, the wearable which detects eating motions, hasn’t been fine-tuned for people who eat with chopsticks instead of forks – despite recently launching in Japan. It is on the to-do list, says Satish Movva, the company’s founder and CEO.

For Clara Berridge, who studies the implications of digital technologies used in elder care at the University of Washington, privacy intrusion on older adults is one of the most worrying risks. She also fears it could reduce human interaction and hands-on care – already lacking in many places – further still, worsening social isolation for older people.

In 2014, Berridge interviewed 20 non-cognitively-impaired elder residents in a low-income independent living apartment building that used an AI-based monitoring system called QuietCare, based on motion detection. It triggered an operator call to residents – escalating to family members if necessary – in cases such as a possible bathroom fall, not leaving the bedroom, a significant drop in overall activity or a significant change in nighttime bathroom use.

What she found was damning. The expectation of routine built into the system disrupted the elders’ activities and caused them to change their behavior to try to avoid unnecessary alerts that might bother family members. One woman stopped sleeping in her recliner because she was afraid it would show inactivity and trigger an alert. Others rushed in the bathroom for fear of the consequences if they stayed too long.

Some residents begged for the sensors to be removed – though others were so lonely they tried to game the system so they could chat with the operator.

A spokesperson for PRA Health Sciences, which now makes QuietCare, noted that the configuration studied in the paper was a historical version, and that the current version of QuietCare is installed only in assisted living facilities, where facility staff, rather than relatives, are notified of changes in residents’ patterns or deviations in trends.

Berridge’s interviews also revealed something else worrying: evidence of benevolent coercion by social workers and family members to get the elders to adopt the technology. There is a “potential for conflict”, says Berridge. Another of her studies has found big differences in enthusiasm for in-home monitoring systems between older people and their adult kids. The latter were gung ho.

Though sometimes the seniors win the day. Startup Cherry Labs is pivoting partly because it ran into problems obtaining seniors’ consent. Its home monitoring system, Cherry Home, features up to six AI cameras with sound recorders to capture concerning behavior and issue alerts; facial recognition to distinguish others in the space, such as carers, from seniors; and the ability for family members or carers to look in on how the senior is doing in real time.

But Max Goncharov, its co-founder and CEO, notes that business has been tough, not least because adult children couldn’t convince their parents to accept the system. “The seniors were against it,” he says. Cherry Labs now has a different application – targeting its technology at industrial workplaces that want to monitor employee safety.

Franklin, in Inglewood, says the fact her system uses motion sensors rather than cameras is a big deal. She and her dad, Donald, are African American and she just couldn’t imagine her dad being comfortable with a video-based system. “He was born in 1940 in the south and he has seen the evolution and backpedaling on racial issues. He definitely has some scars. There are various parts of our American culture he is distrustful of,” says Franklin.

She has done her best to explain the monitoring system, for which she now pays $40 a month, simply and without sugar-coating. For the most part, he’s all right with it as long as it helps her.

“I never want to be a burden,” he says. But he also wants her to know that he has a plan if they ever decide the technology is too invasive: they can move out of their townhome and rent it to someone else.

“You have to have a trick bag to protect yourself from their trick bag,” he tells her. “I am still your dad no matter how many sensors you got.”

Reposted from: https://www.theguardian.com/us-news/2021/jun/03/elder-care-artificial-intelligence-software

Read more articles related to Elder Care: Elder Care Series

Related: 

Amazon launches its $19.99 per month ‘Alexa Together’ elder care subscription for families


Companion Robots: A New Way to Betray the Elders
