Mobile World Congress always has more than its fair share of weird. Last week at MWC, the prize for bonkers went to a Korean company called Hyodol, which proudly showed off a disturbing-looking ChatGPT-enabled companion doll aimed at older adults. Now, this $1,800 AI-enabled doll may well look like something you’d find in a haunted attic, but it’s actually meant to act as an interactive digital pal for people experiencing loneliness or living in long-term care facilities.
Thanks to the large language model stuffed inside the doll, the Hyodol can supposedly hold conversations with its owners, as well as provide health reminders such as when to take medication or eat a meal. It’s every bit as connected as you can imagine, with a companion app and web monitoring platform that lets caretakers monitor the device and its user from afar.
Hyodol’s sensor-laden ChatGPT-enabled dolls can interact with their owners.
It’s meant as a balm for the epidemic of loneliness, which has affected everyone from older adults in nursing homes to college students. Elizabeth Necka, a program director at the US National Institute on Aging, says there’s something to this kind of tech, particularly when used in nursing homes that are already suffering from widespread staffing shortages.
“The idea that there might be a low-cost solution that can mitigate feelings of loneliness is very attractive,” Necka says. “Whether or not ChatGPT can actually achieve those feelings of connections, to me, it seems a little bit premature to say that.”
There is certainly an industry for these devices. The market for social robots is especially active in countries such as Japan, where robots like Lovot and Qoobo (“a tailed cushion that heals your heart”) have made cuddly, adorable companion bots en vogue. These devices have been adopted in Western countries as well, though with much less cultural acceptance. But the current tendency for companies to put generative AI into everything means everywhere is probably due for a wave of these conversational Chuckies.
“I think the industry is still trying to understand the market,” says Lillian Hung, an assistant professor and Canada Research Chair in Senior Care at the University of British Columbia School of Nursing. “It’s still in its infancy, but it has certainly taken off.”
Social robot roommate Jibo initially caused a stir, but sadly didn’t live long.
Not that there haven’t been other attempts. Jibo, a social robot roommate that used AI and endearing gestures to bond with its owners, had its collective plug unceremoniously pulled just a few years after being put out into the world. Meanwhile, another US-grown offering, Moxie, an AI-empowered robot aimed at helping with child development, is still active.
It’s hard not to look at devices like this and shudder at the possibilities. There’s something inherently disturbing about tech that plays at being human, and that uncanny deception can rub people the wrong way. After all, our science fiction is replete with AI beings, many of them tales of artificial intelligence gone horribly wrong. The easy, and admittedly lazy, comparison to something like the Hyodol is M3GAN, the 2023 film about an AI-enabled companion doll that goes full murderbot.
But aside from off-putting dolls, social robots come in many forms. They’re assistants, pets, retail workers, and often socially inept weirdos that just kind of hover awkwardly in public. But they’re also sometimes weapons, spies, and cops. It’s with good reason that people are suspicious of these automatons, whether they come in a fluffy package or not.
Wendy Moyle is a professor at the School of Nursing & Midwifery at Griffith University in Australia who works with patients experiencing dementia. She says her work with social robots has angered people, who sometimes see giving robot dolls to older adults as infantilizing.
“When I first started using robots, I had a lot of negative feedback, even from staff,” Moyle says. “I would present at conferences and have people throw things at me because they felt that this was inhuman.”
However, the atmosphere around assistive robots has gotten less hostile in recent years as they’ve found more positive uses. Robotic companions are bringing joy to people with dementia. During the Covid pandemic, caretakers used robotic companions like Paro, a small robot meant to look like a baby harp seal, to help ease loneliness in older adults. Hyodol’s smiling dolls, whether you see them as sickly or sweet, are meant to evoke a similar friendly response.
Cute or creepy? Hyodol’s ChatGPT-enabled doll can interact with and monitor older adults.
Hyodol isn’t alone in its AI companion endeavors for older adults. ElliQ, an AI-enabled product made by the Israeli company Intuition Robotics, has been used in trial programs assisting older adults in New York. It’s less cuddly, though, coming in the form of a lamp-like bulb that can sit on a nightstand. What Hyodol is aiming to do is combine that functionality with the fuzzy, feel-good form factor of the big-eyed Paro seal. (Hyodol did not respond to multiple requests for comment.)
The baby harp seal Paro robot is aimed at easing loneliness in older adults.
Even without AI in them, these pseudo-sentient companion dolls have garnered their share of concerns. Moyle, who has helped oversee studies of these devices in elder care, says, in some cases, people who depend on them for care can become too attached to the dolls.
“One of the negative aspects we had to contend with was some residents loved their doll so much, they were babies to them,” Moyle says. “They were babies that they could carry, they could have with them. Some small majority loved them so much that it became too much a part of their life. We had to try and reduce the amount of time that they were using them.”
ElliQ, an AI-enabled product, has been used in trial programs assisting older adults in New York.
Embedding language skills in a companion doll via something as hallucination-prone and erratic as ChatGPT risks unsettling effects of its own. AI-enabled dolls invite the same apprehensions that surround putting AI into any other device. Generative AI like ChatGPT frequently produces false information and is open to a range of security exploits, to say nothing of the fact that all data from a ChatGPT interface is sent back to OpenAI. Like any tool that records a user and transmits that data, these devices also raise privacy and security issues. And caretakers over-relying on a robot for things such as reminding patients to take medications adds practical failure points.
“Considerable effort is required to ensure that conversations with robots are safe,” says Hung. She emphasizes the need to make sure the robots can’t coax people into doing something unethical, harvest their information, or ask elderly users for confidential data like credit card numbers.
There’s an inherent risk, Moyle says, when companies push people to lean on their products during their most vulnerable moments.
“If we give somebody an opportunity to talk to AI, does that remove all other social opportunities?” Moyle says. “Does that mean that families stop visiting? Does that mean that the staff stops talking to the person?” It’s a risk, but she says, in her experience, many older adults in care facilities are often left by themselves for the vast majority of their days and nights as it is. “Giving them something, if it makes them happy, is much better than giving them nothing at all.”
Of course, these devices aren’t the same as a human. Large language models don’t understand the person interacting with them; they’re just very good at predicting what will sound like a good response. And they certainly can’t fully grasp a person’s emotions or mental state.
“People can be displaying quite challenging emotions that are not being picked up by the AI,” Moyle says. “As AI becomes more sophisticated, that probably will get better, but at the moment it’s certainly not.” She pauses, then laughs and adds, “But a lot of humans can’t assess emotions very well either, so…”
For lots of people, it doesn’t really matter if a robot can’t love them back. It’s why we still mourn our robots dying slow, somber deaths, and hold funerals for robot dogs. It’s why we want personalities in our sexbots and trust them with our deepest desires. When a human interacts with a robot, it’s less about whether the robot can love you back, and more about how people derive value from pouring their own feelings into someone (or something) else.
“What the cat and what the baby gave us is a sense that they need our love, and that’s what we are longing for as humans,” Hung says. If someone is looking to interact with a cute and cuddly robot, it’s often to fulfill that same function. “We buy these robots because we want to give our love to them—so we feel that the robot needs our love, so we feel that there’s something who needs us. That’s the nature of humans.”