<node id="649301">
  <nid>649301</nid>
  <type>news</type>
  <uid>
    <user id="34528"><![CDATA[34528]]></user>
  </uid>
  <created>1628210755</created>
  <changed>1628210767</changed>
  <title><![CDATA[New Browser-Based Chart Builder Gives Line Graphs, Scatterplots Their Very Own Audio Track]]></title>
  <body><![CDATA[<p>A new multimodal data visualization tool for the web produces charts with a twist &ndash; these charts also represent information using carefully designed sounds for a richer, more powerful, and accessible way to experience data.</p>

<p>Released by the Georgia Institute of Technology and open-source web application Highcharts, <a href="https://sonification.highcharts.com/#/">Highcharts Sonification Studio (HSS)</a>&nbsp;enables users to enter data into a spreadsheet to create traditional visual charts such as line graphs, scatterplots, and bar charts. At the same time, the tool creates non-speech audio tracks based on the data, a process known as sonification.</p>

<p>&ldquo;The goal of this tool is to provide a simple, intuitive, and accessible way for users to import, edit, visualize, and sonify their data, and then export the results to a useful format,&rdquo; said Professor <strong>Bruce Walker</strong>, director of <a href="http://sonify.psych.gatech.edu/">Georgia Tech&rsquo;s Sonification Lab</a>. &ldquo;We want users to be able to use the tool without having to download software or write code, and without prior sonification expertise.&rdquo;</p>

<p>The data visualization+sonification approach lets users explore data with visual, auditory, or both modalities. This can lead to novel discoveries in its own right, and can also support users who may have limited ability to see or hear a given display.&nbsp;</p>

<p>&ldquo;Visually impaired readers find sonification and auditory graphs to be very useful for getting an overview of the data, as well as identifying patterns, outliers, and points of interest,&rdquo; said Walker.</p>

<p><strong>Brandon Biggs</strong>, a researcher&nbsp;and entrepreneur who is blind, highlighted how the software lets users like him create a graph they can trust will be visually appealing.</p>

<p>&ldquo;I love how accessible all the components are with a screen-reader and how easy it is to create a sonification,&rdquo; Biggs said.</p>

<p>And for all users&mdash;even those who can see&mdash;sound can communicate information without requiring visual attention. For instance, instead of looking at a weather forecast or a chart of a stock price on a screen, imagine being able to hear the ups and downs played like a melody, with additional sounds highlighting points of interest in the data.</p>

<p>HSS is the culmination of a multi-year collaboration between Highsoft&mdash;the makers of Highcharts&mdash;and the Georgia Tech Sonification Lab. The goal of the collaboration is to develop an extensible, accessible, online spreadsheet and multimodal graphing platform for the auditory display, assistive technology, and STEM education community.</p>

<p>Walker said that HSS is a systematic re-implementation of his lab&rsquo;s Sonification Sandbox to integrate Highsoft&rsquo;s industry-leading web-based Highcharts technology with Georgia Tech&rsquo;s expertise in sonification and interactive auditory displays.</p>

<p>The tool is open-sourced under the MIT License to allow for extensions and forks in development from the community&nbsp;and to ensure the tool is available to all. A Highcharts license is required for commercial use of the tool, but otherwise, usage is completely free.</p>

<p>&ldquo;This system will complement other tools and libraries actively used by the auditory display research community and help bring sonification to an even wider audience, especially in the visualization community and in situations of limited resources,&rdquo; said <strong>&Oslash;ystein Moseng</strong>, the Highcharts developer leading the implementation of the HSS.</p>

<p>A paper describing the research and development of the open-source tool is part of the 26<sup>th</sup> annual International Conference on Auditory Displays (ICAD.org), which took place June 25-28, 2021. The paper <em>Highcharts Sonification Studio: An Online, Open-Source, Extensible, And Accessible Data Sonification Tool</em> is co-authored by Stanley Cantrell, Walker, and Moseng.</p>

<p>The Highcharts Sonification Studio web app, source code, and developer community are available at <a href="https://sonification.highcharts.com">https://sonification.highcharts.com</a>.</p>

]]></body>
  <field_subtitle>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_subtitle>
  <field_dateline>
    <item>
      <value>2021-08-05T00:00:00-04:00</value>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_dateline>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Georgia Tech researchers have created a data visualization plus sonification approach that lets users explore data with visual, auditory, or both modalities.]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>Georgia Tech researchers have created a data visualization plus sonification approach that lets users explore data with visual, auditory, or both modalities.</p>
]]></value>
    </item>
  </field_summary>
  <field_media>
          <item>
        <nid>
          <node id="649088">
            <nid>649088</nid>
            <type>image</type>
            <title><![CDATA[Data vis sonification tool]]></title>
            <body><![CDATA[]]></body>
                          <field_image>
                <item>
                  <fid>246435</fid>
                  <filename><![CDATA[sonify-2.jpg]]></filename>
                  <filepath><![CDATA[/sites/default/files/images/sonify-2.jpg]]></filepath>
                  <file_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/images/sonify-2.jpg]]></file_full_path>
                  <filemime>image/jpeg</filemime>
                  <image_740><![CDATA[]]></image_740>
                  <image_alt><![CDATA[A user working with accessible browser-based Highcharts Sonification Studio software.]]></image_alt>
                </item>
              </field_image>
            
                      </node>
        </nid>
      </item>
      </field_media>
  <field_contact_email>
    <item>
      <email><![CDATA[Jpreston@cc.gatech.edu]]></email>
    </item>
  </field_contact_email>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_contact>
    <item>
      <value><![CDATA[<p>Josh Preston, Research Communications Mgr.<br />
<a href="mailto:Jpreston@cc.gatech.edu?subject=Sonification">Jpreston@cc.gatech.edu</a></p>
]]></value>
    </item>
  </field_contact>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <!--  TO DO: correct to not conflate categories and news room topics  -->
  <og_groups>
          <item>1278</item>
          <item>443951</item>
      </og_groups>
  <og_groups_both>
          <item>
        <![CDATA[Research]]>
      </item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>135</tid>
        <value><![CDATA[Research]]></value>
      </item>
      </field_categories>
  <core_research_areas>
          <term tid="39501"><![CDATA[People and Technology]]></term>
      </core_research_areas>
  <field_news_room_topics>
      </field_news_room_topics>
  <links_related>
          <link>
      <url>https://youtu.be/VdKcyGXLyvg</url>
      <title></title>
      </link>
      </links_related>
  <files>
      </files>
  <og_groups_both>
          <item><![CDATA[College of Sciences]]></item>
          <item><![CDATA[School of Psychology]]></item>
      </og_groups_both>
  <field_keywords>
          <item>
        <tid>170772</tid>
        <value><![CDATA[Sonification]]></value>
      </item>
          <item>
        <tid>438</tid>
        <value><![CDATA[data]]></value>
      </item>
          <item>
        <tid>7257</tid>
        <value><![CDATA[visualization]]></value>
      </item>
          <item>
        <tid>167710</tid>
        <value><![CDATA[School of Psychology]]></value>
      </item>
      </field_keywords>
  <field_userdata>
      <![CDATA[]]>
  </field_userdata>
</node>
