The University of Plymouth
Forget peer pressure: future generations are more likely to be influenced by robots, a study suggests.
The research, conducted at the University of Plymouth, found that while adults were not swayed by the robots, the children were.
The children's tendency to trust robots without question raises ethical issues as the machines become more pervasive, the researchers said.
They called on the robotics community to build in safeguards for children.
The participants in the study completed a simple test, known as the Asch paradigm, which involves identifying which two lines match in length.
Known as the conformity experiment, the test has shown that people tend to agree with their peers even when, individually, they would have given a different answer.
In this case, the companions were robots. When children between the ages of seven and nine were alone in the room, they scored an average of 87% on the test.
But when the robots joined them, their scores dropped to 75% on average. Of their incorrect answers, 74% matched those given by the robots.
Professor of robotics Tony Belpaeme, who led the research, said: "People often follow the opinions of others, and we have known for a long time that it is hard to resist taking on the views and opinions of the people around us. We know this as conformity. But as robots will soon be in the home and the workplace, we wondered whether people would conform to robots.
"What our results show is that adults do not conform to what the robots are saying. But when we did the experiment with children, they did. It shows that children perhaps have more of an affinity with robots than adults do, which raises the question: what would happen if robots were to suggest, for example, what products to buy or what to think?"
The conclusion? Children yielded to the social pressure exerted by a group of robots, while adults resisted being influenced by them.
The researchers said that further discussion was needed about protective measures to "minimise the risk to children during social child-robot interaction".
Professor Noel Sharkey, who chairs the Foundation for Responsible Robotics, said of the research: "This study reinforces concerns about the use of robots with children.
"If robots can convince children (but not adults) that false information is true, the implications of the projected commercial exploitation of robots for childminding and teaching are problematic."
But he added: "One of the missing components of the study was testing the children with a disembodied computer voice. This means we cannot know whether the effect came from the robots themselves or simply from the voices played through them."