{"669338":{"#nid":"669338","#data":{"type":"news","title":"New Technology Promises More Efficient and Practical Virtual Reality Systems","body":[{"value":"\u003Cp\u003EGlitchy games and bulky headsets may soon be things of the past thanks to a new eye-tracking system for virtual reality\/augmented reality (VR\/AR).\u003C\/p\u003E\r\n\r\n\u003Cp\u003EEye tracking is an essential component of AR\/VR systems, but current systems are limited by the bulk of lens-based cameras and the high communication cost between the camera and the backend system.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EGeorgia Tech School of Computer Science Associate Professor Yingyan (Celine) Lin, Ph.D. student Haoran You, and postdoctoral researcher Yang (Katie) Zhao have developed a new eye-tracking system that works around these limitations by combining a recently developed lensless camera with co-designed algorithms and accelerator hardware.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cThe current VR headsets are too heavy, gaming can lag, and using the controller is cumbersome. Combined, this prevents users from having a truly immersive experience. We mitigate all these problems,\u201d said You.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThis new system, \u003Cem\u003EEyeCoD: An Accelerated Eye Tracking System via FlatCam-based Algorithm \u0026amp; Accelerator Co-Design\u003C\/em\u003E, replaces the traditional camera lens with FlatCam, a lensless camera five to 10 times thinner and lighter. Combined with FlatCam, the team\u2019s system performs eye tracking at a reduced size and with improved efficiency, without sacrificing the accuracy of the tracking algorithm.
The system could also enhance user privacy because it contains no lens-based camera.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EAnother feature of the \u003Cem\u003EEyeCoD\u003C\/em\u003E system is that it renders in high resolution only the portion of the screen that a user\u2019s eyes focus on. It does this by predicting where a user\u2019s eyes may land, then instantaneously rendering those areas in high resolution. These computational savings, plus a dedicated accelerator, underpin \u003Cem\u003EEyeCoD\u003C\/em\u003E\u2019s ability to boost processing speed and efficiency.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe team received the \u003Ca href=\u0022https:\/\/licensing.research.gatech.edu\/\u0022 target=\u0022_blank\u0022\u003EOffice of Technology Licensing\u2019s\u003C\/a\u003E Tech Ready Grant for its efforts earlier this year.
Tech Ready Grants offer $25,000 to help faculty transition projects from the lab to the marketplace.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe team hopes to use the funds to integrate the current demos into a compact eye-tracking system for use in commercial VR\/AR headsets.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EAlong with winning the Tech Ready Grant, the team presented \u003Cem\u003EEyeCoD\u003C\/em\u003E at the International Symposium on Computer Architecture (ISCA) 2022. IEEE Micro included the work in its \u003Cem\u003ETop Picks from the Computer Architecture Conferences\u003C\/em\u003E for 2023. The annual publication highlights \u201csignificant research papers in computer architecture based on novelty and potential for long-term impact.\u201d\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cem\u003EEyeCoD\u003C\/em\u003E is a collaborative work. Collaborators include Rice University Professor Ashok Veeraraghavan, whose team provided the technical support and design of the FlatCam camera in \u003Cem\u003EEyeCoD\u003C\/em\u003E; and Ziyun Li, of Meta, who provided technical input to ensure that the EyeCoD system aligns with industry AR\/VR specifications.\u003C\/p\u003E\r\n","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EA new eye-tracking system called EyeCoD, developed by researchers at Georgia Tech, uses a lensless camera to reduce the size and weight of VR\/AR headsets, improve efficiency, and enhance user privacy while selectively rendering high-resolution
screen areas based on where the user is focusing at any given moment.\u003C\/p\u003E\r\n","format":"limited_html"}],"field_summary_sentence":[{"value":"A new eye-tracking system developed at Georgia Tech uses a lensless camera to reduce the size and weight of VR\/AR headsets, improves efficiency, and enhances user privacy."}],"uid":"32045","created_gmt":"2023-09-01 12:51:20","changed_gmt":"2023-09-01 12:58:03","author":"Ben Snedeker","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2023-08-30T00:00:00-04:00","iso_date":"2023-08-30T00:00:00-04:00","tz":"America\/New_York"},"extras":[],"hg_media":{"671567":{"id":"671567","type":"image","title":"A closeup of glass panels on the College of Computing\u0027s Binary Bridge","body":null,"created":"1693572695","gmt_created":"2023-09-01 12:51:35","changed":"1693572695","gmt_changed":"2023-09-01 12:51:35","alt":"A closeup of glass panels on the College of Computing\u0027s Binary Bridge","file":{"fid":"254652","name":"news-default-image - New.png","image_path":"\/sites\/default\/files\/2023\/09\/01\/news-default-image%20-%20New.png","image_full_path":"http:\/\/www.tlwarc.hg.gatech.edu\/\/sites\/default\/files\/2023\/09\/01\/news-default-image%20-%20New.png","mime":"image\/png","size":549085,"path_740":"http:\/\/www.tlwarc.hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/2023\/09\/01\/news-default-image%20-%20New.png?itok=Y9Lzgtn4"}}},"media_ids":["671567"],"groups":[{"id":"50875","name":"School of Computer Science"},{"id":"47223","name":"College of Computing"},{"id":"50876","name":"School of Interactive Computing"}],"categories":[{"id":"135","name":"Research"}],"keywords":[{"id":"10199","name":"Daily Digest"}],"core_research_areas":[{"id":"39501","name":"People and 
Technology"}],"news_room_topics":[],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003EMorgan Usry, Communications Officer I\u003C\/p\u003E\r\n\r\n\u003Cp\u003ESchool of Computer Science\u003C\/p\u003E\r\n\r\n\u003Cp\u003Emorgan.usry@cc.gatech.edu\u003C\/p\u003E\r\n","format":"limited_html"}],"email":[],"slides":[],"orientation":[],"userdata":""}}}