Expanding the Possibilities for Accessible Data Visualization
iSchool professor and Trace affiliate faculty Dr. Niklas Elmqvist leads a team to continue work on accessible data visualization and to make a high school data science course accessible for all students with a $2 million contract from the Maryland State Department of Education.
Extracting insights from massive datasets in order to answer questions and make decisions — this is the primary objective of data science. Deriving clarity from noise. Communicating those insights often involves producing data visualizations — shifting our focus from numerical values to graphical elements in order to facilitate understanding. However, not everyone has access to these data visualizations, and this barrier impedes Blind students who seek to pursue the field of data science.
Dr. Niklas Elmqvist, professor in the UMD College of Information Studies (iSchool) and affiliate faculty of the Trace R&D Center, which focuses on accessibility research, is facing this seemingly intractable problem head-on, so that high school students, first here in Maryland and eventually around the country, will be able to participate in this increasingly critical and exciting field of study.
A recently awarded $2 million contract from the Maryland State Department of Education (MSDE) will fund Dr. Elmqvist and his team, including iSchool PhD student Pramod Chundury, along with their partners at Data11y, led by Dr. Andreas Stefik, and Prime Access Consulting (PAC), led by Sina Bahram, to create accessible materials for a high school data science course so that all students have equal access. Data11y will focus on creating and implementing the materials, while the UMD team will focus on evaluating them, expanding the possibilities of accessibility solutions for data visualization.
The challenges are significant. While the screen readers often used by Blind or visually impaired people (i.e., software programs that read the text displayed on the computer screen with a speech synthesizer) can navigate tables, they do not scale to large datasets. Nor are braille displays well suited to this task. A great deal of material produced in the field of data science for sighted users simply cannot be easily replicated for Blind users. For the coursework to be accessible, a solution must allow students to read the data in charts, interpret and communicate workflows, interact with the data, and describe the data visualizations.
“Sensory substitution” is a general assistive technology approach that replaces one sense with another, and could be used here to substitute tactile or auditory information for visual information. For example, in an audio representation, pitch might encode the height of a bar in a bar chart — the higher the pitch, the taller the bar. Yet many current sensory substitution techniques are not scalable because they take too much time to create (e.g., waiting for thermoform printing), are too costly, or are not widely available.
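The pitch-for-height idea can be illustrated with a short sketch. This is a minimal, hypothetical example; the function name, frequency range, and sample data are illustrative assumptions, not part of the project's actual software.

```python
# Illustrative sketch: linearly map bar heights to audio pitches,
# so that taller bars produce higher tones (the sonification idea
# described above). All names and values here are assumptions.

def value_to_pitch(value, vmin, vmax, fmin=220.0, fmax=880.0):
    """Linearly map a data value to a frequency in Hz.

    220 Hz (A3) for the shortest bar, 880 Hz (A5) for the tallest.
    """
    if vmax == vmin:
        return fmin  # degenerate chart: every bar the same height
    fraction = (value - vmin) / (vmax - vmin)
    return fmin + fraction * (fmax - fmin)

# "Sonify" a small bar chart: one tone per bar, swept left to right.
bar_heights = [3, 7, 5, 10]
lo, hi = min(bar_heights), max(bar_heights)
pitches = [value_to_pitch(v, lo, hi) for v in bar_heights]
print([round(p, 1) for p in pitches])  # → [220.0, 597.1, 408.6, 880.0]
```

A real system would synthesize these frequencies as audible tones; the linear mapping above simply makes the relative heights of the bars comparable by ear.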
Elmqvist and Chundury are building on iSonic, a tool created more than a decade ago by Drs. Ben Shneiderman (UMD professor emeritus of computer science), Catherine Plaisant (UMD research scientist emerita), Jonathan Lazar (then a professor at Towson University, now an iSchool professor and director of the Trace Center), and others. The tool renders a map as audio: a user sweeps left to right to hear different pitches that encode the mapped data, such as unemployment rates across the United States. The map is also divided into areas that correspond to a numeric keypad, allowing targeted auditory exploration of specific regions of interest.
The new project will bring this idea to mobile devices and will include additional types of data representations. One focus of the work will be facilitating interaction between the user and the data. Sighted users interact with charts and other data representations in a variety of ways: taking in the graphical information by eye, zooming in, selecting particular points, filtering data, and so on. This work seeks to identify and generalize these types of interactions and then make them accessible to Blind users through other means, including sound and haptic feedback, with input via keyboards and touchscreens.
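The generalization described above can be sketched in miniature: interactions such as selecting and filtering become operations that return spoken descriptions instead of visual updates. The class and method names below are hypothetical, chosen only to illustrate the pattern.

```python
# Hypothetical sketch: chart interactions (select, filter) that
# respond with text suitable for a speech synthesizer rather than
# with visual highlighting. Names here are illustrative assumptions.

class AccessibleChart:
    def __init__(self, labels, values):
        self.points = list(zip(labels, values))

    def select(self, index):
        """Keyboard-driven selection: describe one data point."""
        label, value = self.points[index]
        return f"{label}: {value}"

    def filter_above(self, threshold):
        """Filtering: summarize how many points remain, and which."""
        kept = [(l, v) for l, v in self.points if v > threshold]
        summary = f"{len(kept)} of {len(self.points)} points above {threshold}"
        return summary, kept

chart = AccessibleChart(["Jan", "Feb", "Mar"], [12, 30, 18])
print(chart.select(1))        # → Feb: 30
summary, kept = chart.filter_above(15)
print(summary)                # → 2 of 3 points above 15
```

The design choice is that every interaction yields a concise, speakable result, so the same operations sighted users perform visually remain available through a screen reader or audio output.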
The MSDE contract will fund this work through September 2023.