{"156701":{"#nid":"156701","#data":{"type":"news","title":"Georgia Tech Creating High-Tech Tools to Study Autism","body":[{"value":"\u003Cp class=\u0022Body1\u0022\u003EResearchers in Georgia Tech\u2019s Center for Behavior Imaging have developed two new technological tools that automatically measure relevant behaviors of children and promise to have a significant impact on the understanding of behavioral disorders such as autism.\u003C\/p\u003E\u003Cp class=\u0022Body1\u0022\u003EOne of the tools\u2014a system that uses special gaze-tracking glasses and facial-analysis software to identify when a child makes eye contact with the glasses-wearer\u2014combines two existing technologies to enable automatic detection of eye contact. The other is a wearable system that uses accelerometers to monitor and categorize problem behaviors in children with behavioral disorders.\u003C\/p\u003E\u003Cp class=\u0022Body1\u0022\u003EBoth technologies are already being deployed in the Center for Behavior Imaging\u2019s (CBI) ongoing work to apply computational methods to the screening, measurement and understanding of autism and other behavioral disorders.\u003C\/p\u003E\u003Cp class=\u0022Body1\u0022\u003EChildren at risk for autism often display distinct behavioral markers from a very young age. One such marker is a reluctance to make frequent or prolonged eye contact with other people. Discovering an automated way to detect this and other telltale behavioral markers would be a significant step toward scaling autism screening up to much larger populations than are currently reached. 
This is one goal of the five-year, $10 million \u201cExpeditions\u201d project, funded in fall 2010 by the National Science Foundation under principal investigator and CBI Director Jim Rehg, also a professor in Georgia Tech\u2019s School of Interactive Computing.\u003C\/p\u003E\u003Cp class=\u0022Body1\u0022\u003EThe eye-contact tracking system begins with a commercially available pair of glasses that can record the focal point of their wearer\u2019s gaze. An adult wearing the glasses interacted with a child, and the glasses\u2019 front-facing camera captured video of the child. The video was then processed using facial-recognition software available from a second manufacturer. Combine the glasses\u2019 hard-wired ability to detect wearer gaze with the facial-recognition software\u2019s ability to detect the child\u2019s gaze direction, and the result is a system that, in a test interaction with a 22-month-old, detected eye contact with 80 percent accuracy. The study was conducted in Georgia Tech\u2019s Child Study Lab (CSL), a child-friendly experimental facility richly equipped with cameras, microphones and other sensors.\u003C\/p\u003E\u003Cp class=\u0022Body1\u0022\u003E\u201cEye gaze has been a tricky thing to measure in laboratory settings, and typically it\u2019s very labor-intensive, involving hours and hours of looking at frames of video to pinpoint moments of eye contact,\u201d Rehg said. \u201cThe exciting thing about our method is that it can produce these measures automatically and could be used in the future to measure eye contact outside the laboratory setting. We call these results preliminary because they were obtained from a single subject, but all humans\u2019 eyes work pretty much the same way, so we\u2019re confident the successful results will be replicated with future subjects.\u201d\u003C\/p\u003E\u003Cp\u003EThe other new system, developed in collaboration with the Marcus Autism Center in Atlanta and Dr. 
Thomas Ploetz of Newcastle University in the United Kingdom, is a package of sensors, worn via straps on the wrists and ankles, that uses accelerometers to detect movement by the wearer. Algorithms developed by the team analyze the sensor data to automatically detect episodes of problem behavior and classify them as aggressive, self-injurious or disruptive (e.g., throwing objects).\u003C\/p\u003E\u003Cp\u003EResearchers first developed the algorithms by putting the sensors on four Marcus clinic staff members who together performed some 1,200 different behavior instances. The system detected \u201cproblem\u201d behaviors with 95 percent accuracy and classified all behaviors with 80 percent accuracy. They then used the sensors with a child diagnosed on the autism spectrum, and the system detected the child\u2019s problem-behavior episodes with 81 percent accuracy and classified them with 70 percent accuracy.\u003C\/p\u003E\u003Cp class=\u0022Body1\u0022\u003E\u201cThese results are very promising in leading the way toward more accurate and reliable measurement of problem behavior, which is important in determining whether treatments targeting these behaviors are working,\u201d said CSL Director Agata Rozga, a research scientist in the School of Interactive Computing and co-investigator on the Expeditions award. \u201cOur ultimate goal with this wearable sensing system is to be able to gather data on the child\u2019s behavior beyond the clinic, in settings where the child spends most of their time, such as their home or school. 
In this way, parents, teachers and others who care for the child can be potentially alerted to times and situations when problem behaviors occur so that they can address them immediately.\u201d\u003C\/p\u003E\u003Cp class=\u0022Body1\u0022\u003E\u201cWhat these tools show is that computational methods and technologies have great promise and potential impact on the lives of many children and their parents and caregivers,\u201d said Gregory Abowd, Regents\u2019 Professor in the School of Interactive Computing and a prominent researcher in technology and autism. \u201cThese technologies we are developing, and others developed and explored elsewhere, aim to bring more effective early-childhood screening to millions of children nationwide, as well as enhance care for those children already diagnosed on the autism spectrum.\u201d\u003C\/p\u003E\u003Cp class=\u0022Body1\u0022\u003EBoth technologies were presented in early September at the 14\u003Csup\u003Eth\u003C\/sup\u003E ACM International Conference on Ubiquitous Computing (Ubicomp 2012). Among the other devices under study at CSL are a camera\/software system that tracks children\u2019s facial expressions and customized speech-analysis software that detects vocalization patterns.\u003C\/p\u003E\u003Cp class=\u0022Body1\u0022\u003EFor more information on behavioral imaging, visit the Georgia Tech\/NSF website on computational behavioral science at \u003Ca href=\u0022http:\/\/www.cbs.gatech.edu\/\u0022\u003Ehttp:\/\/www.cbs.gatech.edu\u003C\/a\u003E. 
For information or to volunteer for one of CBI\u2019s ongoing studies, visit the Child Study Lab website at \u003Ca href=\u0022http:\/\/childstudy.hsi.gatech.edu\/\u0022\u003Ehttp:\/\/childstudy.hsi.gatech.edu\u003C\/a\u003E.\u003C\/p\u003E","summary":null,"format":"limited_html"}],"field_subtitle":[{"value":"Innovations will lead to better treatment, assessment for children"}],"field_summary":[{"value":"\u003Cp class=\u0022Body1\u0022\u003EResearchers in Georgia Tech\u2019s Center for Behavior Imaging have developed two new technological tools that automatically measure relevant behaviors of children, and promise to have significant impact on the understanding of behavioral disorders such as autism.\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":"","uid":"27560","created_gmt":"2012-09-25 10:19:43","changed_gmt":"2016-10-08 03:12:50","author":"Jason Maderer","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2012-09-25T00:00:00-04:00","iso_date":"2012-09-25T00:00:00-04:00","tz":"America\/New_York"},"extras":[],"hg_media":{"156711":{"id":"156711","type":"image","title":"Child Study Lab","body":null,"created":"1449178872","gmt_created":"2015-12-03 21:41:12","changed":"1475894792","gmt_changed":"2016-10-08 02:46:32","alt":"Child Study Lab","file":{"fid":"195303","name":"child-study-lab-1_0.jpg","image_path":"\/sites\/default\/files\/images\/child-study-lab-1_0_0.jpg","image_full_path":"http:\/\/www.tlwarc.hg.gatech.edu\/\/sites\/default\/files\/images\/child-study-lab-1_0_0.jpg","mime":"image\/jpeg","size":598703,"path_740":"http:\/\/www.tlwarc.hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/images\/child-study-lab-1_0_0.jpg?itok=gxnpcago"}},"60510":{"id":"60510","type":"image","title":"Gregory Abowd and James Rehg","body":null,"created":"1449176267","gmt_created":"2015-12-03 20:57:47","changed":"1475894525","gmt_changed":"2016-10-08 02:42:05","alt":"Gregory Abowd and James 
Rehg","file":{"fid":"191144","name":"tzo30302.jpg","image_path":"\/sites\/default\/files\/images\/tzo30302_0.jpg","image_full_path":"http:\/\/www.tlwarc.hg.gatech.edu\/\/sites\/default\/files\/images\/tzo30302_0.jpg","mime":"image\/jpeg","size":1530299,"path_740":"http:\/\/www.tlwarc.hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/images\/tzo30302_0.jpg?itok=CZnAPVtQ"}}},"media_ids":["156711","60510"],"groups":[{"id":"1214","name":"News Room"}],"categories":[{"id":"135","name":"Research"}],"keywords":[{"id":"6053","name":"Autism"},{"id":"44431","name":"Child Study Lab"},{"id":"397","name":"children"}],"core_research_areas":[{"id":"39501","name":"People and Technology"}],"news_room_topics":[],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp class=\u0022Body1\u0022\u003EMichael Terrazas\u003Cbr \/\u003EAssistant Director of Communications\u003Cbr \/\u003ECollege of Computing\u003Ca href=\u0022mailto:mterraza@cc.gatech.edu\u0022\u003E\u003Cbr \/\u003Emterraza@cc.gatech.edu\u003C\/a\u003E\u003Cbr \/\u003E404-245-0707\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E","format":"limited_html"}],"email":["mterraza@cc.gatech.edu"],"slides":[],"orientation":[],"userdata":""}}}