It’s impossible today to read tech-industry news without seeing something about the rise of artificial intelligence, and coverage of robotics is just starting to pick up steam. Healthcare is one of the few sectors where we’re already seeing tangible evidence of robotics in use: several studies show its benefits in diagnostic and data-management roles, and working models have already been built and tested. At the same time, this forces us to ask what a trained human doctor’s responsibilities might become, should widespread adoption of such technology come to fruition.
Humans are naturally responsive to simple robots
In the military, robots are employed as a buffer between life and death, most commonly as bomb disposal units. Dr. Julie Carpenter, a leading expert on human-robot social interaction, noted in a Forbes interview that “There is a very clear awareness by the people using them that these robots are machines and tools. At the same time, that doesn’t prevent some level of social or pseudo-social interaction with the robots.” Carpenter observes that soldiers often develop emotional attachments to these robots, even projecting an extension of themselves onto them, much like a video game avatar. In a medical context, a University of Southern California study suggests that subjects are more willing to reveal personal information to sociable humanoid robots than to human healthcare providers. One participant described talking to the robot, named SimSensei, as “way better than talking to a person,” because they “don’t really feel comfortable talking about personal stuff to other people.”
There are issues robots are primed to fix
It is a disturbing figure to come to grips with, but estimates claim the US wastes $300 billion a year on prescribed medications that are never taken. Looking to solve that is Catalia Health, which raised $2.5 million in seed funding for Mabu, an admittedly cute robot that holds conversations with patients and gently nudges them to take their medications. We might also look to a Mayo Clinic report suggesting that doctors cannot match a computer for accuracy when recording medical history: it claims doctors miss a whopping 55% of psychosocial problems and 45% of patient concerns, a gap primed for AI-powered or robotic data management and symptom recognition. It’s also well known that therapists now commonly use digital media and sociable humanoid robots to help patients work through social anxiety, PTSD, agoraphobia, and addiction, and that surgeons increasingly rely on robotics for greater accuracy.
However, there are crucial responsibilities robots can’t fulfill
As we discussed in our past article about AI, one variable we must take seriously in the rise of robotics is that the human skills in demand will likely shift toward empathy, information synthesis, and creative problem solving. We’ve also discussed how crucial non-verbal behavior is in interactions between patients and doctors. Doctors are needed for difficult moments: breaking bad news with tact and attending to the emotional wellbeing of patients and their families. One study found that a patient’s agreement with their doctor about the treatment plan is strongly associated with recovery, suggesting that the capacity for genuine discussion is crucial as well, something AI and robotics are eons from matching us in. There is further evidence that good communication in general improves patient compliance and overall satisfaction.
All in all, it may be wise to admit that AI and robotics have a worthy place in the medical field, but one perhaps best kept to simpler tasks that stop at facilitation. We should hope that data management, pattern-recognition diagnosis, reminders, scheduling, and simple sociable or emotional aids are automated by synthetic intelligence. Doctors’ responsibilities, however, will likely shift to equally crucial roles: more complex discussion, handling patient and family emotions, managing challenging information, weighing possibilities, and navigating unique circumstances. Healing will always be an experience first and a data point second, and it needs to stay that way, no matter what technology joins the daily rounds.