Admit it: The last time you sat down with a physician and revealed your medical history, did you fudge a bit? Were there certain incidents you were too embarrassed to admit? Did you gloss over certain behaviors that might make you look bad?
It’s a serious problem for health professionals and patients alike. With less-complete information to work with, doctors are more likely to misdiagnose an illness, or prescribe an inappropriate drug.
Recently published research offers a possible solution to this problem: virtual humans. In the journal Computers in Human Behavior, a research team reports that patients are more comfortable discussing private matters with these computer-created entities, and that this ease prompts them to disclose more information.
“The power of VH (virtual human) interviewers to elicit more honest responding comes from the sense that no one is observing or judging,” note the researchers, led by Gale Lucas of the University of Southern California’s Institute for Creative Technologies. People have a strong tendency to want to look good in front of others, including doctors; a virtual interviewer, it seems, can short-circuit that problematic impulse.
Lucas and her colleagues conducted an experiment featuring 239 adults recruited online. In person at their laboratory, each participant interacted with a virtual human seen on a computer screen. The figure was programmed to develop rapport with people; when appropriate, it gave “verbal empathetic feedback” such as “I’m sorry to hear that.” It also conveyed “active and empathetic listening” via nods and expressions.
The figure conducted a semi-structured screening interview with each study participant, as if they were being admitted to a clinic or hospital. After a series of get-to-know-you questions (“Where are you from originally?”), it asked a series of questions about medical issues and specific symptoms.
Half of the participants were told that their conversation was entirely computer-driven and not being observed. The others were informed they were being watched by a person in another room who was also manipulating the machine to ask certain questions. In all cases, video images of their faces were recorded and later analyzed to gauge their level of emotional expression.
Afterwards, participants responded to a series of statements measuring their comfort level with the experience. Finally, an outside observer noted their responses to certain sensitive questions (such as “How close are you to your family?”) and gauged their willingness to disclose personal information.
The result: People disclosed information more honestly and openly when they were told they were speaking exclusively to the computer. The participants also “reported significantly lower fear of self-disclosure” under those circumstances. These results were reiterated by the analysis of their facial expressions, which found they “allowed themselves to display more intense expressions of sadness” when they believed no human was watching them.
So the perception of anonymity was the key. That conclusion was confirmed in several ways, including by noting the closing remarks of many participants. “This is way better than talking to a person,” one commented. “I don’t really feel comfortable talking about personal stuff to other people.”
When it comes to fixing our health-care system, very few people would argue that part of the answer lies in less human interaction. Patients generally want more, not less, contact with health professionals. Yet this study suggests that, at least for the intake interview, a little less of the human touch—and a little more perceived privacy—may be precisely what the doctor ordered.