{"635398":{"#nid":"635398","#data":{"type":"news","title":"People Think Robots Are Pretty Incompetent and Not Funny, New Study Says","body":[{"value":"\u003Cp\u003EDang robots are crummy at so many jobs, and they tell lousy jokes to boot. In two new studies, these were common biases human participants held toward robots.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe studies were originally intended to test for gender bias, that is, if people thought a robot believed to be female may be less competent at some jobs than a robot believed to be male and vice versa. But researchers at the Georgia Institute of Technology discovered no significant sexism against the machines.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;This did surprise us. There was only a very slight difference in a couple of jobs but not significant. There was, for example, a small preference for a male robot over a female robot as a package deliverer,\u0026rdquo; said\u0026nbsp;\u003Cstrong\u003EAyanna Howard\u003C\/strong\u003E, the principal investigator on both studies. Howard is a\u0026nbsp;\u003Ca href=\u0022https:\/\/www.ic.gatech.edu\/people\/ayanna-howard\u0022\u003Eprofessor in and the chair of Georgia Tech\u0026rsquo;s School of Interactive Computing\u003C\/a\u003E.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EAlthough robots are not sentient, as people increasingly interface with them, we begin to humanize the machines. 
Howard studies what goes right as we integrate robots into society and what goes wrong, and much of both has to do with how the humans feel around robots.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003EI hate robots\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;Surveillance robots are not socially engaging, but when we see them, we still may act like we would when we see a police officer, maybe not jaywalking and being very conscientious of our behavior,\u0026rdquo; said Howard, who is also\u0026nbsp;\u003Ca href=\u0022https:\/\/www.ece.gatech.edu\/faculty-staff-directory\/ayanna-maccalla-howard\u0022\u003ELinda J. and Mark C. Smith Chair and Professor in Bioengineering in Georgia Tech\u0026rsquo;s School of Electrical and Computer Engineering\u003C\/a\u003E.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;Then there are emotionally engaging robots designed to tap into our feelings and work with our behavior. If you look at these examples, they lead us to treat these robots as if they were fellow intelligent beings.\u0026rdquo;\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIt\u0026rsquo;s a good thing robots don\u0026rsquo;t have feelings because what study participants lacked in gender bias they more than made up for in judgments against robot competence. That predisposition was so strong that Howard wondered if it may have overridden any potential gender biases against robots \u0026ndash; after all, social science studies have shown that gender biases are still prevalent with respect to human jobs, even if implicit.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIn questionnaires, humanoid robots introduced themselves via video to randomly recruited online survey respondents, who ranged from their twenties to their seventies and were mostly college educated. 
The humans ranked robots\u0026rsquo; career competencies, only trusting the machines to competently perform a handful of simple jobs.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003EPass the scalpel\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;The results baffled us because the things that people thought robots were less able to do were things that they do well. One was the profession of surgeon. There are\u0026nbsp;\u003Ca href=\u0022https:\/\/www.davincisurgery.com\/procedures\/gynecology-surgery\u0022\u003EDa Vinci robots that are pervasive in surgical suites\u003C\/a\u003E, but respondents didn\u0026rsquo;t think robots were competent enough,\u0026rdquo; Howard said. \u0026ldquo;Security guard \u0026ndash; people didn\u0026rsquo;t think robots were competent at that, and there are companies that specialize in great robot security.\u0026rdquo;\u003C\/p\u003E\r\n\r\n\u003Cp\u003ECumulatively, the 200 participants across the two studies thought robots would also fail as nannies, therapists, nurses, firefighters, and totally bomb as comedians. But they felt confident bots would make fantastic package deliverers and receptionists, pretty good servers, and solid tour guides.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe researchers could not say where the competence biases originate. Howard could only speculate that some of the bad rap may have come from media stories of robots doing things like falling into swimming pools or injuring people.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003EIt\u0026rsquo;s a boy\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003EDespite the lack of gender bias, participants readily assigned genders to the humanoid robots. For example, people accepted gender prompts by robots introducing themselves in videos.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIf a robot said, \u0026ldquo;Hi, my name is James,\u0026rdquo; in a male-sounding voice, people mostly identified the robot as male. 
If it said, \u0026ldquo;Hi, my name is Mary,\u0026rdquo; in a female voice, people mostly said it was female.\u003C\/p\u003E\r\n\r\n\u003Cp\u003ESome robots greeted people by saying \u0026ldquo;Hi\u0026rdquo; in a neutral-sounding voice, and still, most participants assigned the robot a gender. The most common choice was male, followed by neutral, then female. For Howard, this was an important takeaway from the study for robot developers.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;Developers should not force gender on robots. People are going to gender according to their own experiences. Give the user that right. Don\u0026rsquo;t reinforce gender stereotypes,\u0026rdquo; Howard said.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003ESocial is good\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003EGendering is only one aspect of robot humanization, something we do altogether too much, Howard said. Some in her field advocate for not building robots in humanoid form at all in order to discourage it, but Howard did not take it that far.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;Robots can be good for social interaction. They could be very helpful in elder care facilities to keep people company. They might also make better nannies than letting the TV babysit the kids,\u0026rdquo; said Howard, who also defended robots\u0026rsquo; comedic talent, provided they are programmed for that.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;If you ever go to an amusement park, there are animatronics that tell really good jokes.\u0026rdquo;\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003ERead the studies\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe two studies were submitted to conferences that were canceled due to COVID-19.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EWhy Should We Gender? 
The Effect of Robot Gendering and Occupational Stereotypes on Human Trust and Perceived Competency was published in\u0026nbsp;\u003Ca href=\u0022https:\/\/doi.org\/10.1145\/3319502.3374778\u0022\u003E\u003Cem\u003EProceedings of the 2020 ACM Conference on Human-Robot Interaction (HRI\u0026rsquo;20)\u003C\/em\u003E\u003C\/a\u003E, which appeared in March 2020. Robot Gendering: Influences on Trust, Occupational Competency, and Preference of Robot Over Human appeared in\u0026nbsp;\u003Cem\u003ECHI 2020 Extended Abstracts\u0026nbsp;\u003C\/em\u003E(computer-human interaction, DOI: 10.1145\/3334480.3382930).\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe research was funded by the National Science Foundation and by the Alfred P. Sloan Foundation.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cem\u003EThe papers\u0026rsquo; coauthors were De\u0026rsquo;Aira Bryant, Kantwon Rogers, and Jason Borenstein from Georgia Tech. The National Science Foundation funded the work via grant 1849101, and the Alfred P. Sloan Foundation via grant G-2019-11435. 
Any findings, conclusions, or recommendations are those of the authors and not necessarily of the sponsors.\u003C\/em\u003E\u003C\/p\u003E\r\n","summary":null,"format":"limited_html"}],"field_subtitle":"","field_summary":"","field_summary_sentence":[{"value":"Two new studies from the College of Computing look for gender bias in people\u0027s perceptions of robots."}],"uid":"32045","created_gmt":"2020-05-15 16:49:17","changed_gmt":"2020-05-15 16:50:43","author":"Ben Snedeker","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2020-05-15T00:00:00-04:00","iso_date":"2020-05-15T00:00:00-04:00","tz":"America\/New_York"},"extras":[],"hg_media":{"635399":{"id":"635399","type":"image","title":"Robot bias study","body":null,"created":"1589561386","gmt_created":"2020-05-15 16:49:46","changed":"1589561386","gmt_changed":"2020-05-15 16:49:46","alt":"","file":{"fid":"241788","name":"Robot-intros-768x594.jpg","image_path":"\/sites\/default\/files\/images\/Robot-intros-768x594.jpg","image_full_path":"http:\/\/www.tlwarc.hg.gatech.edu\/\/sites\/default\/files\/images\/Robot-intros-768x594.jpg","mime":"image\/jpeg","size":77406,"path_740":"http:\/\/www.tlwarc.hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/images\/Robot-intros-768x594.jpg?itok=QqZQnOwo"}}},"media_ids":["635399"],"groups":[{"id":"47223","name":"College of Computing"}],"categories":[],"keywords":[],"core_research_areas":[{"id":"39521","name":"Robotics"}],"news_room_topics":[],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003EBen Brumfield, Sr. 
Science Writer\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Ca href=\u0022mailto:ben.brumfield@comm.gatech.edu?subject=Robot%20Bias\u0022\u003Eben.brumfield@comm.gatech.edu\u003C\/a\u003E\u003C\/p\u003E\r\n","format":"limited_html"}],"email":["ben.brumfield@comm.gatech.edu"],"slides":[],"orientation":[],"userdata":""}}}