We strive to create robots that share human characteristics in order to make them socially acceptable. But this also means our creations cannot escape gender stereotyping.
It would be good to imagine a future in which gender stereotyping no longer limits the aspirations of either men or women. But it seems that even when we are able to hand over some of the arduous chores to robots, there will still be a clear distinction between the roles appropriate for a 'male' robot and those that should be carried out by a 'female' robot.
Researchers at the University of Bielefeld designed an experiment to investigate how facial cues affected perceptions of a robot's characteristics and the tasks it would be suitable for.
The research, which uses Flobi, an anthropomorphic robot head developed at the University of Bielefeld, is a collaboration between Dr Frank Hegel of the department of Applied Informatics and Professor Friederike Eyssel of CITEC (Center of Excellence Cognitive Interaction Technology). Only two changes were made to the head's appearance: hair length and a slight difference in the shape of the lips. This video introduces Flobi and the simple ways in which these transformations were made:
The computerized experiment involved showing the two sets of robot heads to 60 participants, who were told that the research was on attitudes to "modern technologies of the future".
As predicted, the researchers found that hair length was sufficient to make the distinction between being "agentic" (short hair/male) and "communal" (long hair/female). The male robot was perceived as more suitable for tasks such as repairing technical devices and guarding a house, while the female robot was perceived as more suitable for household and care tasks.
While the researchers are not surprised that their findings confirm the prevalence of gender stereotyping, they do at least suggest that robots could be an opportunity to challenge such prejudices:
Taken together, our research shows that people apply gender stereotypes that typically characterize human–human social cognitive processes to robots. Even though the trait attributions may be interpreted in terms of anthropomorphism, our findings also document that gender stereotypes seem to be so deeply ingrained that people even applied them to machines with a male or a female appearance.
Given such findings, issues related to the ethics of robot design inevitably arise: Should gender stereotypes be used and exploited in the design of robots or virtual agents to manipulate the user’s mental models? Or should designers instead construct counterstereotypical machines (e.g., female service robots to help a mechanic, male CareBots)? These questions seem to be worth considering by developers and designers of robots because of their social and societal consequences.
The idea is that robots could lead the way for their human counterparts in avoiding gender-imposed restrictions. Although it seems a roundabout route to achieving gender equality, it is a step in the right direction.
Eyssel, F. A., & Hegel, F. (2012). (S)he's got the look: Gender stereotyping of social robots. Journal of Applied Social Psychology. Published online July 27, 2012. DOI: 10.1111/j.1559-1816.2012.00937.x