How does a robot know that a human is not feeling comfortable at all? I suppose a whole bunch of sensors paired with the right algorithms could do the job, but a recent University of Auckland study showed how a preference for humanlike features on a robot’s display screen might change the way robots are designed in the future. Around 60% of participants preferred a robot that displayed the most humanlike, skin-colored 3D virtual face over a robot with no face at all, while a robot with silver-colored, simplified human features on its “face” managed only a 10% approval rate. I guess this means that future robots developed for healthcare and home care ought to feature a face that is as close to a human’s as possible in order to build a “bridge” of sorts in human-robot interaction.
Dr Elizabeth Broadbent from the University’s Department of Psychological Medicine shared, “It’s important for robot designers to know how to make robots that interact effectively with humans, so that people feel comfortable interacting with the robots. One key dimension is robot appearance and how humanlike the robot should be.” Would you mind if future robots were like Sonny in I, Robot?