As artificial intelligence continues to advance, it's changing how we think about human-robot interaction. Can robots convey emotions visually? And how do humans respond when they do? These questions led researchers to study how people interpret robot expressions, and they used Miko as their benchmark.
Read on for the key takeaways from the study, which was a collaboration between Miko and a group of esteemed international researchers.
Several studies have shown that humans respond more positively to robots that acknowledge their emotional state and respond accordingly. In this study, researchers asked participants to identify basic human expressions on two robots: a hybrid-face robot and Miko. The results were published in the IEEE (Institute of Electrical and Electronics Engineers) Internet of Things Journal.
The study's authors included Miko's Co-Founder and CEO, Sneh Vaswani, and Miko's Co-Founder and CTO, Prashant Iyengar, as well as researchers from Imperial College London; the University of California, Santa Barbara; the University of Bristol; Radboud University; and Friedrich-Alexander University.
In the study, participants viewed a variety of facial expressions in a random order — first from a hybrid-face robot and then from Miko. When participants saw the hybrid-face robot's expressions, they correctly identified the intended emotions 80% of the time, illustrating how well the hybrid-face robot was able to convey those feelings to humans.
The researchers also studied the participants' brain waves during the exercise. They found that participants exhibited face-sensitive brain activity while watching the expressions on the robot's face.
When the researchers conducted the same exercise with Miko, participants were able to identify more than 90% of the bot's emotional expressions.
Participants had higher recognition rates for Miko's “happy,” “angry” and “sad” expressions than for the hybrid-face robot's — and significantly higher recognition of Miko's “afraid” expression. Participants' brain waves responded to Miko's “emotions” much as they had during the hybrid-face portion of the study.
Researchers concluded that Miko, even with simplified facial expressions, successfully conveyed emotions — and that this enriched the human-robot relationship.
In today's world, far too many interactions between kids and technology are one-sided. Research on the effects of this has varied, but it suggests that excessive passive screen time can be harmful, while meaningful interaction with technology offers educational and social benefits. This is especially true when parents and kids engage with technology together.
Miko offers your child an interactive, conversational, engaging experience. That experience is enhanced not only by Miko's ability to display human expressions and emotions, but also by the fresh, new interactive content that Miko receives through constant updates.
Miko engineers are using the latest advancements in artificial intelligence and machine learning to enhance your family’s Miko experience. Our goal is simple: to continue helping kids around the world discover the spark of interactive, playful learning.