{"667347":{"#nid":"667347","#data":{"type":"news","title":"Examining the Boundaries of Using AI \u0027Sensing\u0027 to Understand Office Workers\u2019 Performance and Wellbeing","body":[{"value":"\u003Cp\u003E\u003Cem\u003ENew research findings show that social acceptability and select sharing of AI results in the workplace are key to future implementation\u003C\/em\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003ECommercial monitoring tools are being introduced in offices alongside newer modes of work \u2013 screen meetings, remote collaboration, digital-first workflows \u2013 as a way for employers to better understand the performance of their workforces.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EResearchers at Georgia Tech and Northeastern University conducted a study with information workers to learn about their perspectives on being monitored and their information being collected with passive-sensing enabled artificial intelligence (PSAI), where computing devices can unobtrusively detect and collect user behaviors. That information could then be used to train machine learning models that infer the performance and wellbeing of workers.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cWe wanted to take a closer look at how workers perceive passive-sensing AI in order to make this technology work for the workers, as opposed to making them work for the technology,\u201d said\u0026nbsp;\u003Cstrong\u003EVedant Das Swain\u003C\/strong\u003E, lead researcher and a Ph.D. candidate in computer science at Georgia Tech.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EHe says there is an organizational need \u2013 for employer and employee alike \u2013 to get better insights.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cOne of the underlying subtexts of the research is that there are these asymmetries at work because the employee doesn\u2019t have as much power as the employer. 
And if these technologies keep progressing as they are, this gap is going to widen because the employer will just keep getting more and more worker information.\u201d\u003C\/p\u003E\r\n\r\n\u003Cp\u003EResearchers found that some technologies \u2013 fitness trackers and webcams, for example \u2013 used for personal activities may not translate well to work life if they are implemented without considering new norms of work. Technologies can now \u201cbreach physical boundaries,\u201d as Das Swain puts it, and using a webcam for work while at home might involve extra setup to close doors and blur backgrounds on the screen. Workers also want careful consideration of the context in which devices can gain information.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EWork devices monitoring worker activity is appropriate in many cases, but work-related apps on personal devices might be a tougher sell.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe research results fall into two primary categories:\u003C\/p\u003E\r\n\r\n\u003Cul\u003E\r\n\t\u003Cli\u003E\u003Cstrong\u003EAppropriateness\u003C\/strong\u003E\u0026nbsp;\u2013 Understanding which data are socially acceptable to collect with passive-sensing AI, and under what circumstances it is acceptable to infer worker performance and wellbeing.\u003C\/li\u003E\r\n\t\u003Cli\u003E\u003Cstrong\u003EDistribution\u003C\/strong\u003E\u0026nbsp;\u2013 Determining which worker data to share with other stakeholders, when to share it, and through what methods.\u003C\/li\u003E\r\n\u003C\/ul\u003E\r\n\r\n\u003Cp\u003ERegarding the appropriateness aspect, Das Swain says that people in general don\u2019t want to feel dehumanized by algorithms. His team\u2019s work takes that idea further by learning about the mental models different workers use to determine what\u2019s appropriate for using PSAI.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cDifferent workers have different ideas of what\u2019s insightful,\u201d he said. 
\u201cFor example, if I don\u2019t talk to my supervisor about my personal life, why should this machine be sensing that type of information? The alternative viewpoint is that I already know what I\u2019m doing at work, so give me more data. I could use sleep and commute data to infer how those activities might affect my work.\u201d\u003C\/p\u003E\r\n\r\n\u003Cp\u003EDas Swain says there is no one-size-fits-all solution.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cAnd it\u2019s not just about privacy, it\u2019s about utility,\u201d he said. \u201cPeople find utility in different things. Some want more precise information in a work context, and some might want the holistic view of the data, in both cases to find insights for themselves.\u201d\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe second category of results \u2013 distribution \u2013 is no less tricky. Worker information is ostensibly personal in nature, but collaborative and performance measures at work necessitate the sharing of this information.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe researchers found that participants strongly felt that if a machine predicted something related to performance or wellbeing, then they should have enough time to make changes and provide context, such as if a worker is on paternity leave and must alter project deadlines.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cOnly at a later point, if at all, can the data be escalated to someone else to help as the situation requires,\u201d said Das Swain. 
\u201cThat was very clear in the study.\u201d\u003C\/p\u003E\r\n\r\n\u003Cp\u003EOne red flag, so to speak, for Das Swain as a researcher is that these technologies don\u2019t afford users any control over \u2013 or insight into \u2013 the newer types of personal data being collected and stored at work.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EWith algorithmic uncertainty now at the forefront of many conversations, Das Swain views these results from the Georgia Tech and Northeastern group as tangible guideposts for regulators and companies making decisions around public and commercial deployment of AI sensing tech for information workers.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe published results will be presented at the ACM CHI Conference on Human Factors in Computing Systems, taking place April 23-28 in Hamburg, Germany. The academic paper,\u0026nbsp;\u003Ca href=\u0022https:\/\/programs.sigchi.org\/chi\/2023\/program\/content\/95708\u0022\u003E\u003Cem\u003EAlgorithmic Power or Punishment: Information Worker Perspectives on Passive Sensing Enabled AI Phenotyping of Performance and Wellbeing\u003C\/em\u003E\u003C\/a\u003E, is co-authored by Das Swain,\u0026nbsp;\u003Cstrong\u003ELan Gao\u003C\/strong\u003E,\u0026nbsp;\u003Cstrong\u003EWilliam Wood\u003C\/strong\u003E,\u0026nbsp;\u003Cstrong\u003ESrikruthi C. Matli\u003C\/strong\u003E,\u0026nbsp;\u003Cstrong\u003EGregory Abowd\u003C\/strong\u003E, and\u0026nbsp;\u003Cstrong\u003EMunmun De Choudhury\u003C\/strong\u003E. 
The work is funded in part by Cisco.\u003C\/p\u003E\r\n","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EResearchers at Georgia Tech and Northeastern University conducted a study with information workers to learn about their perspectives on being monitored and their information being collected with passive-sensing enabled artificial intelligence (PSAI), where computing devices can unobtrusively detect and collect user behaviors.\u003C\/p\u003E\r\n","format":"limited_html"}],"field_summary_sentence":[{"value":"New research findings show that social acceptability and select sharing of AI results in the workplace are key to future implementation."}],"uid":"32045","created_gmt":"2023-04-14 14:23:22","changed_gmt":"2023-04-14 14:26:21","author":"Ben Snedeker","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2023-04-14T00:00:00-04:00","iso_date":"2023-04-14T00:00:00-04:00","tz":"America\/New_York"},"extras":[],"hg_media":{"670546":{"id":"670546","type":"image","title":"pic_web_cc_vedant das swain2.png","body":"\u003Cp\u003ESchool of Interactive Computing Ph.D. candidate Vedant Das Swain, lead researcher of a study dubbed \u0022Algorithmic Power or Punishment\u0022 that identifies current boundaries of using AI \u0022sensing\u0022 tools in office spaces.\u0026nbsp;\u003Cem\u003E(Photos by Kevin Beasley\/College of Computing)\u003C\/em\u003E\u003C\/p\u003E\r\n","created":"1681482219","gmt_created":"2023-04-14 14:23:39","changed":"1681482219","gmt_changed":"2023-04-14 14:23:39","alt":"Vedant Das Swain, Ph.D. 
candidate in computer science at Georgia Tech.","file":{"fid":"253427","name":"pic_web_cc_vedant das swain2.png","image_path":"\/sites\/default\/files\/2023\/04\/14\/pic_web_cc_vedant%20das%20swain2.png","image_full_path":"http:\/\/www.tlwarc.hg.gatech.edu\/\/sites\/default\/files\/2023\/04\/14\/pic_web_cc_vedant%20das%20swain2.png","mime":"image\/png","size":472783,"path_740":"http:\/\/www.tlwarc.hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/2023\/04\/14\/pic_web_cc_vedant%20das%20swain2.png?itok=SziVeOTz"}}},"media_ids":["670546"],"groups":[{"id":"576481","name":"ML@GT"},{"id":"50876","name":"School of Interactive Computing"}],"categories":[],"keywords":[],"core_research_areas":[],"news_room_topics":[],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003EJosh Preston\u003Cbr \/\u003E\r\nResearch Communications Manager\u003Cbr \/\u003E\r\n\u003Ca href=\u0022mailto:jpreston@cc.gatech.edu\u0022\u003Ejpreston@cc.gatech.edu\u003C\/a\u003E\u003C\/p\u003E\r\n","format":"limited_html"}],"email":[],"slides":[],"orientation":[],"userdata":""}}}