{"668644":{"#nid":"668644","#data":{"type":"news","title":"Researchers Highlight Ethical Issues for Developing Future AI Assistants","body":[{"value":"\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EMost people use voice assistant technologies like Alexa or Google Assistant for list-making and quick weather updates. But imagine if these technologies could do much more \u2014 summarize doctor\u2019s appointments, remind someone to take their medicines, manage their schedule (knowing which events take priority), and not only read a recipe but also create reminders to shop for ingredients \u2014 without the user having to prompt it. If a smart assistant could use artificial intelligence to take away some of the cognitive load for common tasks, it could help older adults preserve their independence and autonomy.\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003ENext-generation smart assistants aren\u2019t on the market yet, but the research necessary to create them is underway now. This includes efforts to develop smart assistants that are proactive \u2014 that is, the system could anticipate the user\u2019s wants and needs, and even assist and mediate social interactions between users and their support networks. But with the design of systems that seek to enhance the abilities of older adults as they experience cognitive decline, a broad range of ethical issues arises. \u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EResearchers from the NSF \u003Ca href=\u0022https:\/\/www.ai-caring.org\/\u0022\u003EAI Institute for Collaborative Assistance and Responsive Interaction for Networked Groups (AI-CARING)\u003C\/a\u003E saw a need to outline some of these issues up front, with the hope that designers will consider them when developing the next generation of smart assistants. 
The team\u2019s article, \u201c\u003Ca href=\u0022https:\/\/ieeexplore.ieee.org\/document\/10017383\u0022\u003EEthical Issues in Near-Future Socially Supportive Smart Assistants for Older Adults\u003C\/a\u003E,\u201d was published in the journal \u003Cem\u003EIEEE Transactions on Technology and Society\u003C\/em\u003E. \u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003E\u201cWe\u0027re trying to provide a landscape of the ethical issues designers need to take into account long before advanced smart assistant systems show up in a person\u2019s home,\u201d said \u003Ca href=\u0022https:\/\/spp.gatech.edu\/people\/person\/79e785b1-0bad-5022-9bee-7126ced2c846\u0022\u003EJason Borenstein\u003C\/a\u003E, professor of ethics and director of Graduate Research Ethics Programs in the \u003Ca href=\u0022https:\/\/spp.gatech.edu\/\u0022\u003ESchool of Public Policy\u003C\/a\u003E and the Office of Graduate and Postdoctoral Education at Georgia Tech. \u201cIf designers don\u0027t think through these issues, then a family might set a relative up with a system, go home, and trust that their relative is safe and secure when they might not be.\u201d\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EAccording to the AI-CARING researchers, when a person relies on an AI system, that person becomes vulnerable to the system in unique ways. For people with age-related cognitive impairment who might use the technology for complicated forms of assistance, the stakes get even higher, with vulnerability increasing as their health declines. 
Systems that fail to perform correctly could put an older adult\u2019s welfare at significant risk.\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003E\u201cIf a system makes a mistake when you\u2019ve relied on it for something benign \u2014 like helping you choose the movie you\u2019re going to watch \u2014 that\u2019s not a big deal,\u201d said \u003Ca href=\u0022https:\/\/www.cmu.edu\/dietrich\/philosophy\/people\/faculty\/london.html\u0022\u003EAlex John London\u003C\/a\u003E, lead author of the paper and K\u0026amp;L Gates Professor of Ethics and Computational Technologies at Carnegie Mellon University. \u201cBut if you\u2019ve relied on it to remind you to take your medicine, and it doesn\u2019t remind you or tells you to take the wrong medicine, that would be a big problem.\u201d \u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EAccording to the researchers, to develop a system that truly prioritizes the user\u2019s well-being, designers should consider issues such as trust, reliance, privacy, and a person\u2019s changing cognitive abilities. They should also make sure the system supports the user\u2019s goals rather than the goals of an outside party such as a family member, or even a company that might seek to market products to the user. \u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EA system like this would require a nuanced and constantly evolving model of the user and their preferences, incorporating data from a variety of different sources. For a smart assistant to effectively do its job, it might need to share some of the main user\u2019s information with other entities, which can expose the user to risk. 
\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EFor example, a user might want the physician\u2019s office to know that they would like a doctor\u2019s appointment. But depending on the person, they may not want that information shared with their children, or they may want it shared with one child but not another. According to the researchers, designers should consider methods of sharing personal information that also uphold the user\u2019s ability to control it. \u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EOver-trust and under-trust in the system\u2019s abilities are also important issues to consider. Over-trust occurs when people project abilities onto a technology that it doesn\u2019t have, which could put them at risk when the system fails to deliver as they anticipated. Under-trust can be a problem as well: if a system could help a person with an important task but the person chooses not to use it, they are left without that help. \u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003E\u201cThe goal of our analysis is to point out challenges for creating truly assistive AI systems so that they can be incorporated into the design of AI from the beginning,\u201d London said. 
\u201cThis can also help stakeholders create benchmarks for performance that reflect these ethical requirements rather than trying to address ethical issues after the system has already been designed, developed, and tested.\u201d\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EAccording to Borenstein, when smart assistants are created and introduced into homes, the primary user\u2019s well-being and goals should be the foremost concern.\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003E\u201cDesigners are certainly well-intended, but all of us can benefit from the exchange of ideas across disciplines, and from talking with people with different perspectives on these kinds of technologies,\u201d Borenstein said. \u201cThis is just one piece of that puzzle that can hopefully inform the design process.\u201d\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003E\u003Cstrong\u003ECitation\u003C\/strong\u003E: A. J. London, Y. S. Razin, J. Borenstein, M. Eslami, R. Perkins and P. Robinette, \u0022\u003Ca href=\u0022https:\/\/ieeexplore.ieee.org\/document\/10017383\u0022\u003EEthical Issues in Near-Future Socially Supportive Smart Assistants for Older Adults\u003C\/a\u003E,\u0022 in\u0026nbsp;\u003Cem\u003EIEEE Transactions on Technology and Society\u003C\/em\u003E. \u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003E\u003Cstrong\u003EDOI\u003C\/strong\u003E: 10.1109\/TTS.2023.3237124\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003EGeorgia Tech is bringing together the finest minds and voices to explore artificial intelligence \u2014 the opportunities, the risks, and above all the ethical and responsible stewardship of AI. To see our presenters and register to attend Avant South on Sept. 
28 \u2013 29, visit \u003Ca href=\u0022https:\/\/avantsouth.com\/\u0022\u003Eavantsouth.com\u003C\/a\u003E.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EResearchers from AI-CARING\u0026nbsp;outline the ethical issues up front, with the hope that designers will consider them when developing the next generation of smart assistants. \u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n","format":"limited_html"}],"field_summary_sentence":[{"value":"With the design of AI systems that seek to enhance the abilities of older adults as they experience cognitive decline, a broad range of ethical issues arises. "}],"uid":"36123","created_gmt":"2023-07-31 22:47:37","changed_gmt":"2023-08-30 13:17:30","author":"Catherine Barzler","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2023-07-31T00:00:00-04:00","iso_date":"2023-07-31T00:00:00-04:00","tz":"America\/New_York"},"extras":[],"hg_media":{"671290":{"id":"671290","type":"image","title":"gettyimages-1288932957-170667a.jpg","body":"\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003ENext-generation smart assistants will likely be designed to anticipate a user\u2019s wants and needs, and even assist and mediate social interactions between users and their support networks. 
\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n","created":"1690843901","gmt_created":"2023-07-31 22:51:41","changed":"1690843901","gmt_changed":"2023-07-31 22:51:41","alt":"An elderly woman with short white hair smiles and looks at a smart speaker system.","file":{"fid":"254320","name":"gettyimages-1288932957-170667a.jpg","image_path":"\/sites\/default\/files\/2023\/07\/31\/gettyimages-1288932957-170667a.jpg","image_full_path":"http:\/\/www.tlwarc.hg.gatech.edu\/\/sites\/default\/files\/2023\/07\/31\/gettyimages-1288932957-170667a.jpg","mime":"image\/jpeg","size":101781,"path_740":"http:\/\/www.tlwarc.hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/2023\/07\/31\/gettyimages-1288932957-170667a.jpg?itok=KxniF5Ae"}}},"media_ids":["671290"],"groups":[{"id":"1214","name":"News Room"}],"categories":[],"keywords":[{"id":"187915","name":"go-researchnews"}],"core_research_areas":[],"news_room_topics":[{"id":"71881","name":"Science and Technology"},{"id":"71901","name":"Society and Culture"}],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003ECatherine Barzler, Senior Research Writer\/Editor\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Ca href=\u0022mailto:catherine.barzler@gatech.edu\u0022\u003Ecatherine.barzler@gatech.edu\u003C\/a\u003E\u003C\/p\u003E\r\n","format":"limited_html"}],"email":[],"slides":[],"orientation":[],"userdata":""}}}