{"616279":{"#nid":"616279","#data":{"type":"news","title":"\u0027Human Rights\u0027 May Help Shape Artificial Intelligence in 2019","body":[{"value":"\u003Cp\u003EEthics and accountability will be among the most significant challenges for artificial intelligence (AI) in 2019, according to a survey of researchers at Georgia Tech\u0026rsquo;s College of Computing.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIn response to an email query about AI developments that can be expected in 2019, most of the researchers \u0026ndash; whether talking about \u003Ca href=\u0022http:\/\/ml.gatech.edu\/\u0022\u003Emachine learning\u003C\/a\u003E (ML), \u003Ca href=\u0022http:\/\/www.robotics.gatech.edu\/\u0022\u003Erobotics\u003C\/a\u003E, \u003Ca href=\u0022http:\/\/vis.gatech.edu\/\u0022\u003Edata visualizations\u003C\/a\u003E, \u003Ca href=\u0022https:\/\/gtnlp.wordpress.com\/\u0022\u003Enatural language processing\u003C\/a\u003E, or other facets of AI \u0026ndash; touched on the growing importance of recognizing the needs of people in AI systems.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;In 2019, I hope we will see AI researchers and practitioners start to frame the debate about proper and improper uses of artificial intelligence and machine learning in terms of human rights,\u0026rdquo; said Associate Professor \u003Ca href=\u0022http:\/\/eilab.gatech.edu\/mark-riedl\u0022\u003E\u003Cstrong\u003EMark Riedl\u003C\/strong\u003E\u003C\/a\u003E.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003E\u003Ca href=\u0022https:\/\/youtu.be\/o-YLQJ-oRqE\u0022 target=\u0022_blank\u0022\u003E[RELATED: Is AI Coming For My Job?]\u003C\/a\u003E\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;More and more, interpretability and fairness are being recognized as critical issues to address to ensure AI appropriately interacts with society,\u0026rdquo; said Ph.D. 
student\u0026nbsp;\u003Cstrong\u003E\u003Ca href=\u0022https:\/\/fredhohman.com\/\u0022\u003EFred Hohman\u003C\/a\u003E\u003C\/strong\u003E.\u003C\/p\u003E\r\n\r\n\u003Ch4\u003E\u003Cstrong\u003ETaking on algorithmic bias\u003C\/strong\u003E\u003C\/h4\u003E\r\n\r\n\u003Cp\u003EQuestions about the rights of end users of AI-enabled services and products are becoming a priority, but Riedl said more is needed.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;Companies are making progress in recognizing that AI systems may be biased in prejudicial ways. [However,] we need to start talking about the next step: remedy. How do people seek remedy if they believe an AI system made a wrong decision?\u0026rdquo; said Riedl.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EAssistant Professor \u003Ca href=\u0022http:\/\/jamiemorgenstern.com\/\u0022\u003E\u003Cstrong\u003EJamie Morgenstern\u003C\/strong\u003E\u003C\/a\u003E sees algorithmic bias as an ongoing concern in 2019 and gave banking as an example of an industry that may be in the news for its algorithmic decision-making.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;I project that we\u0026rsquo;ll have more high-profile examples of financial systems that use machine learning having worse rates of lending to women, people of color, and other communities historically underrepresented in the \u0026lsquo;standard\u0026rsquo; American economic system,\u0026rdquo; Morgenstern said.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003E\u003Ca href=\u0022https:\/\/www.cc.gatech.edu\/news\/615576\/georgia-tech-researchers-working-improve-fairness-ml-pipeline\u0022 target=\u0022_blank\u0022\u003E[RELATED:\u0026nbsp;Researchers Working To Improve Fairness in the ML Pipeline]\u003C\/a\u003E\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIn recent years corporate responses to cases of bias have been hit or miss, but Assistant Professor \u003Ca href=\u0022http:\/\/www.munmund.net\/\u0022\u003E\u003Cstrong\u003EMunmun De 
Choudhury\u003C\/strong\u003E\u003C\/a\u003E said 2019 may see a shift in how tech companies balance their shareholders\u0026rsquo; interests with the interests of their customers and society.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;[Companies] will be increasingly subject to governmental regulation and will be forced to come up with safeguards to address misuse and abuse of their technologies, and will even consider broader partnerships with their market competitors to achieve this. For some corporations, business interests may take a backseat to ethics until they regain customer trust,\u0026rdquo; said De Choudhury.\u003C\/p\u003E\r\n\r\n\u003Ch4\u003E\u003Cstrong\u003EWorking toward more transparency\u003C\/strong\u003E\u003C\/h4\u003E\r\n\r\n\u003Cp\u003EOne way companies can regain that trust is through sharing their algorithms with the public, our experts said.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;Developers tend to walk around feeling objective because \u0026lsquo;it\u0026rsquo;s the algorithm that is determining the answer\u0026rsquo;. Moving forward, I believe that the algorithms will have to be increasingly \u0026lsquo;inspectable\u0026rsquo; and developers will have to explain their answers,\u0026rdquo; said Executive Associate Dean and Professor \u003Ca href=\u0022https:\/\/www.cc.gatech.edu\/fac\/Charles.Isbell\/\u0022\u003E\u003Cstrong\u003ECharles Isbell\u003C\/strong\u003E\u003C\/a\u003E.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EPh.D. student\u0026nbsp;\u003Ca href=\u0022https:\/\/www.cc.gatech.edu\/~ypinter3\/\u0022\u003E\u003Cstrong\u003EYuval Pinter\u003C\/strong\u003E\u003C\/a\u003E agreed. 
In the coming year, \u0026ldquo;[I] think we will see that researchers are trying to [develop] techniques and tests that can help us to better understand what\u0026rsquo;s going on in the actual wiring of our very fancy machine learning models.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;This is not only for curiosity but also because legal applications or regulation in various countries are starting to require that algorithmic decision-making programs be able to explain why they are doing what they are doing,\u0026rdquo; said Pinter.\u003C\/p\u003E\r\n\r\n\u003Cp\u003ERegents\u0026rsquo; Professor \u003Ca href=\u0022https:\/\/www.cc.gatech.edu\/aimosaic\/faculty\/arkin\/\u0022\u003E\u003Cstrong\u003ERon Arkin\u003C\/strong\u003E\u003C\/a\u003E believes that these concerns are becoming more central precisely because artificial intelligence will continue to grow in importance in our everyday lives.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003E\u003Ca href=\u0022https:\/\/www.ic.gatech.edu\/podcasts\/ep-1-pt-1-whos-behind-wheel\u0022 target=\u0022_blank\u0022\u003E[RELATED: Who\u0026#39;s Behind the Wheel?]\u003C\/a\u003E\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;Despite continued hype and omnipresent doomsayers, panic and fear over the growth of AI and robotics should begin to subside in 2019 as the benefits to people\u0026rsquo;s lives are becoming more apparent to the world.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;However, I expect to see lawyers jumping into the fray so we may also see lawsuits determining policy for self-driving cars [and other applications] more so than government regulation or the legal system,\u0026rdquo; said Arkin.\u003C\/p\u003E\r\n","summary":null,"format":"limited_html"}],"field_subtitle":[{"value":"Georgia Tech experts highlight need to address bias and transparency in ongoing debate about role of AI"}],"field_summary":"","field_summary_sentence":[{"value":"Georgia Tech researchers say ethics 
and transparency are likely top 2019 trends in the burgeoning field of AI."}],"uid":"32045","created_gmt":"2019-01-11 20:36:29","changed_gmt":"2019-01-25 15:27:43","author":"Ben Snedeker","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2019-01-15T00:00:00-05:00","iso_date":"2019-01-15T00:00:00-05:00","tz":"America\/New_York"},"extras":[],"hg_media":{"616435":{"id":"616435","type":"image","title":"GT Computing 2019 AI Predictions","body":null,"created":"1547573803","gmt_created":"2019-01-15 17:36:43","changed":"1547573803","gmt_changed":"2019-01-15 17:36:43","alt":"GT Computing 2019 AI Predictions","file":{"fid":"234636","name":"Predictions rotator_final main.png","image_path":"\/sites\/default\/files\/images\/Predictions%20rotator_final%20main.png","image_full_path":"http:\/\/www.tlwarc.hg.gatech.edu\/\/sites\/default\/files\/images\/Predictions%20rotator_final%20main.png","mime":"image\/png","size":176681,"path_740":"http:\/\/www.tlwarc.hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/images\/Predictions%20rotator_final%20main.png?itok=SSjn7fVQ"}}},"media_ids":["616435"],"groups":[{"id":"47223","name":"College of Computing"},{"id":"545781","name":"Institute for Data Engineering and Science"},{"id":"576481","name":"ML@GT"},{"id":"50877","name":"School of Computational Science and Engineering"},{"id":"50875","name":"School of Computer Science"},{"id":"50876","name":"School of Interactive Computing"},{"id":"1299","name":"GVU Center"}],"categories":[],"keywords":[{"id":"2556","name":"artificial intelligence"},{"id":"9167","name":"machine learning"},{"id":"180204","name":"algorithmic bias"},{"id":"2947","name":"transparency"},{"id":"180205","name":"riedl"},{"id":"180206","name":"hohman"},{"id":"175631","name":"isbell"},{"id":"180207","name":"de choudhury"},{"id":"180208","name":"morgenstern"},{"id":"180209","name":"arkin"},{"id":"180210","name":"2019 trends"}],"core_research_areas":[{"id":"39501","name":"People 
and Technology"},{"id":"39521","name":"Robotics"},{"id":"39541","name":"Systems"}],"news_room_topics":[],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003EAlbert Snedeker, Communications Manager\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Ca href=\u0022mailto:albert.snedeker@cc.gatech.edu?subject=2019%20AI%20Predictions\u0022\u003Ealbert.snedeker@cc.gatech.edu\u003C\/a\u003E\u003C\/p\u003E\r\n","format":"limited_html"}],"email":["albert.snedeker@cc.gatech.edu"],"slides":[],"orientation":[],"userdata":""}}}