{"400051":{"#nid":"400051","#data":{"type":"news","title":"Research advances security and trust in reconfigurable devices","body":[{"value":"\u003Cp\u003EA research team at the \u003Ca href=\u0022http:\/\/www.gtri.gatech.edu\/\u0022\u003EGeorgia Tech Research Institute\u003C\/a\u003E (GTRI) is studying a range of security challenges involving programmable logic devices \u2013 in particular, field programmable gate arrays (FPGAs).\u003C\/p\u003E\u003Cp\u003EFPGAs are integrated circuits whose hardware can be reconfigured \u2013 even partially during run-time \u2013 enabling users to create their own customized, evolving microelectronic designs. They combine hardware performance and software flexibility so well that they\u0027re increasingly used in aerospace, defense, consumer devices, high-performance computing, vehicles, medical devices, and other applications.\u003C\/p\u003E\u003Cp\u003EBut these feature-rich devices come with potential vulnerabilities \u2013 the very configurability of an FPGA can be used to compromise its security. The slightest tweak, accidental or malicious, to the internal configuration of a programmable device can drastically affect its functionality. Conversely, when security and trust assurances can be established for these devices, they can provide greater, higher-performance resilience against cyber attacks than difficult-to-assure software-based protections.\u003C\/p\u003E\u003Cp\u003EThe GTRI researchers have identified multiple issues that could become serious threats as these devices become increasingly common.\u003C\/p\u003E\u003Cp\u003E\u0022Because FPGAs are programmable and they tightly couple software and hardware interfaces, there\u0027s concern they may introduce a whole new class of vulnerabilities compared to other microelectronic devices,\u0022 said Lee W. Lerner, a researcher who leads the GTRI team studying FPGA security. 
\u0022There are entirely new attack vectors to consider, ones that lie outside the traditional computer security mindset.\u0022\u003C\/p\u003E\u003Cp\u003EConventional protections such as software or network-based security measures could be undermined by altering the logic of a system utilizing programmable devices.\u003C\/p\u003E\u003Cp\u003E\u0022The potential to access and modify the underlying hardware of a system is like hacker Nirvana,\u0022 Lerner said.\u003C\/p\u003E\u003Cp\u003ETraditional hardware security evaluation practices \u2013 such as X-raying chips to look for threats built in during manufacturing \u2013 are of little use since an FPGA could be infected with Trojan logic or malware after system deployment. Most programmable devices are still at risk, including those embedded in autonomous vehicles, critical infrastructure, wearable computing devices, and in the Internet of Things, a term that refers to online control devices ranging from smart thermostats to industrial systems.\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EMyriad Possibilities\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003EFPGA chips are constructed from heterogeneous logic blocks such as digital signal processors, block memory, processor cores, and arrays of programmable electronic logic gates. They also include a vast interconnect array that implements signal routing between logic blocks. Their functionality is dictated by the latest configuration bitstream downloaded to the device, commonly referred to as a design.\u003C\/p\u003E\u003Cp\u003EAn FPGA\u0027s adaptability gives it clear advantages over the familiar application-specific integrated circuit (ASIC), which comes from the foundry with its functionality permanently etched in silicon. Unlike an ASIC, for instance, an FPGA containing some sort of error can often be quickly fixed in the field. 
One example application that uses this flexibility well is software-defined radio, where an FPGA can function as one type of signal-processing circuit and then quickly morph into another to support a different type of waveform.\u003C\/p\u003E\u003Cp\u003EThe earliest FPGAs appeared 30 years ago, and today their logic circuits can replicate a wide range of digital devices, including entire central processing units and other microprocessors. New internal configurations are created using high-level programming languages and synthesis tools, or low-level hardware description languages and implementation tools, which can reassemble an FPGA\u0027s internal structures.\u003C\/p\u003E\u003Cp\u003EDepending on how they are set up, FPGAs can be configured from external sources or even internally by sub-processes. Lerner refers to their internal configuration capability as a type of \u0022self-surgery\u0022 \u2013 an analogy for how risky it can be.\u003C\/p\u003E\u003Cp\u003EAdditionally, because FPGA architectures are so dense and heterogeneous, it\u0027s very difficult to fully utilize all their resources with any single design, he explained.\u003C\/p\u003E\u003Cp\u003E\u0022For instance, there are many possibilities for how to make connections between logic elements,\u0022 he said. \u0022Unselected or unused resources can be used for nefarious things like implementing a Trojan function or creating an internal antenna.\u0022\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EAnticipating Attacks\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003ETo exploit an FPGA\u0027s vast resources, bad actors might find ways to break into the device or steal design information. Lerner and his team are investigating ways in which hackers might gain the critical knowledge necessary to compromise a chip.\u003C\/p\u003E\u003Cp\u003EOne potential avenue of attack involves \u0022side-channels\u0022 \u2013 physical properties of circuit operation that can be monitored externally. 
A knowledgeable enemy could probe side-channels, such as electromagnetic fields or sounds emitted by a working device, and potentially gain enough information about its internal operations to crack even mathematically sound encryption methods used to protect the design.\u003C\/p\u003E\u003Cp\u003EIn another scenario, third-party intellectual property modules or even design tools from FPGA manufacturers could harbor malicious functionality; such modules and tools typically operate using proprietary formats that are difficult to verify. Alternatively, a rogue employee or intruder could simply walk up to a board and reprogram an FPGA by accessing working external test points. In some systems, wireless attacks are a possibility as well.\u003C\/p\u003E\u003Cp\u003EFPGAs must even contend with physical phenomena to maintain steady operation. Most reprogrammable chips are susceptible to radiation-induced upsets. Incoming gamma rays or high-energy particles could flip configuration values, altering the design function.\u003C\/p\u003E\u003Cp\u003ELerner points to a real-world example of FPGAs in wearable devices: Google Glass, the well-known head-mounted optical technology, which uses an FPGA to control its display.\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EMultiple Security Techniques\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003ETo provide assurance in programmable logic designs, Lerner and his team are developing multiple techniques, such as:\u003C\/p\u003E\u003Cul\u003E\u003Cli\u003EInnovative visualization methods that enable displaying, identifying, and navigating patterns in massive logic designs that could include hundreds of thousands of nodes and connections;\u003C\/li\u003E\u003Cli\u003EApplications of high-level formal analysis tools, which aid the validation and verification process;\u003C\/li\u003E\u003Cli\u003ESystem-level computer simulations focused on emulating how heterogeneous microelectronics like FPGAs function alongside other system 
components.\u003C\/li\u003E\u003C\/ul\u003E\u003Cp\u003EThe GTRI team is also engaged in other areas of research that support design security analysis, including exact- and fuzzy-pattern matching, graph analytics, machine learning \/ emergent behavior, logic reduction, waveform simulation, and large graph visualization.\u003C\/p\u003E\u003Cp\u003EThe team also researches architectures to support trustworthy embedded computing in a variety of applications, such as cyber-physical control. They have developed the Trustworthy Autonomic Interface Guardian Architecture (TAIGA), a digital security measure that is mapped onto a configurable chip such as an FPGA and is wrapped around the interfaces of process controllers. Its goal is to establish a \u0022root-of-trust\u0022 in the system, a term that refers to a set of functions that can always be trusted, in this case to preserve system safety and security.\u003C\/p\u003E\u003Cp\u003ETAIGA monitors how an embedded controller is functioning within the system, to assure that it\u0027s controlling the physical process within specification. Because TAIGA can detect if something is trying to tamper with the physical process under control, it removes the need to fully trust other more vulnerable parts of the system such as supervisory software processes or even the control code itself.\u003C\/p\u003E\u003Cp\u003E\u0022TAIGA ensures process stability \u2013 even if that requires overriding commands from the processor or supervisory nodes,\u0022 Lerner said. \u0022It\u0027s analogous to the autonomic nervous system of the body, which keeps your heart beating and your lungs respiring \u2013 the basic things that your body should be doing to be in a stable state, regardless of anything else that\u0027s going on.\u0022\u003C\/p\u003E\u003Cp\u003EThe team has installed a version of the TAIGA system on a small robot running the Linux operating system. 
When the experiment is ready, Georgia Tech students and other interested persons will be invited to manipulate the robot online and try to compromise its control system at the team\u2019s main website, \u003Ca href=\u0022http:\/\/configlab.gatech.edu\u0022 title=\u0022http:\/\/configlab.gatech.edu\u0022\u003Ehttp:\/\/configlab.gatech.edu\u003C\/a\u003E.\u003C\/p\u003E\u003Cp\u003E\u0022We provide formal assurances that TAIGA will prevent anyone from hacking critical control processes and causing the robot to perform actions deemed unsafe,\u0022 Lerner said. \u0022However, if someone figures out how to run the robot into a wall or damage its cargo, for instance, then obviously we\u0027ll know we have more work to do.\u0022\u003Cbr \/\u003E \u003Cbr \/\u003E\u003Cstrong\u003EResearch News\u003C\/strong\u003E\u003Cbr \/\u003E\u003Cstrong\u003EGeorgia Institute of Technology\u003C\/strong\u003E\u003Cbr \/\u003E\u003Cstrong\u003E177 North Avenue\u003C\/strong\u003E\u003Cbr \/\u003E\u003Cstrong\u003EAtlanta, Georgia 30332-0181\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EMedia Relations Contacts\u003C\/strong\u003E: John Toon (404-894-6986) (\u003Ca href=\u0022mailto:jtoon@gatech.edu\u0022\u003Ejtoon@gatech.edu\u003C\/a\u003E) or Lance Wallace (404-407-7280) (\u003Ca href=\u0022mailto:lance.wallace@gtri.gatech.edu\u0022\u003Elance.wallace@gtri.gatech.edu\u003C\/a\u003E).\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EWriter\u003C\/strong\u003E: Rick Robinson\u003C\/p\u003E","summary":null,"format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EA research team at the Georgia Tech Research Institute (GTRI) is studying a range of security challenges involving programmable logic devices \u2013 in particular, field programmable gate arrays (FPGAs). 
\u0026nbsp;\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"Researchers are studying a range of security challenges involving programmable logic devices."}],"uid":"27303","created_gmt":"2015-04-27 20:32:27","changed_gmt":"2016-10-08 03:18:08","author":"John Toon","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2015-04-28T00:00:00-04:00","iso_date":"2015-04-28T00:00:00-04:00","tz":"America\/New_York"},"extras":[],"hg_media":{"400041":{"id":"400041","type":"image","title":"FPGA Testing2","body":null,"created":"1449246388","gmt_created":"2015-12-04 16:26:28","changed":"1475895117","gmt_changed":"2016-10-08 02:51:57","alt":"FPGA Testing2","file":{"fid":"75791","name":"fpga1.jpg","image_path":"\/sites\/default\/files\/images\/fpga1.jpg","image_full_path":"http:\/\/www.tlwarc.hg.gatech.edu\/\/sites\/default\/files\/images\/fpga1.jpg","mime":"image\/jpeg","size":1463593,"path_740":"http:\/\/www.tlwarc.hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/images\/fpga1.jpg?itok=bPTgKk4c"}},"400031":{"id":"400031","type":"image","title":"FPGA Testing","body":null,"created":"1449246388","gmt_created":"2015-12-04 16:26:28","changed":"1475895117","gmt_changed":"2016-10-08 02:51:57","alt":"FPGA Testing","file":{"fid":"75790","name":"fpga2.jpg","image_path":"\/sites\/default\/files\/images\/fpga2.jpg","image_full_path":"http:\/\/www.tlwarc.hg.gatech.edu\/\/sites\/default\/files\/images\/fpga2.jpg","mime":"image\/jpeg","size":1849345,"path_740":"http:\/\/www.tlwarc.hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/images\/fpga2.jpg?itok=Ik4loLyR"}}},"media_ids":["400041","400031"],"groups":[{"id":"1188","name":"Research Horizons"}],"categories":[{"id":"153","name":"Computer Science\/Information Technology and Security"},{"id":"147","name":"Military 
Technology"},{"id":"135","name":"Research"}],"keywords":[{"id":"124871","name":"FPGA"},{"id":"416","name":"GTRI"},{"id":"63161","name":"integrated circuits"},{"id":"124901","name":"programmable logic"},{"id":"167055","name":"security"}],"core_research_areas":[{"id":"39451","name":"Electronics and Nanotechnology"},{"id":"39481","name":"National Security"}],"news_room_topics":[{"id":"71881","name":"Science and Technology"}],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003EJohn Toon\u003C\/p\u003E\u003Cp\u003EResearch News\u003C\/p\u003E\u003Cp\u003E\u003Ca href=\u0022mailto:jtoon@gatech.edu\u0022\u003Ejtoon@gatech.edu\u003C\/a\u003E\u003C\/p\u003E\u003Cp\u003E(404) 894-6986\u003C\/p\u003E","format":"limited_html"}],"email":["jtoon@gatech.edu"],"slides":[],"orientation":[],"userdata":""}}}