Accessibility for PhET Interactive Simulations: Progress, Challenges, and Potential
Emily B. Moore
Abstract
The PhET Interactive Simulations project (http://phet.colorado.edu) has begun designing accessibility features into our new suite of HTML5 simulations. With 130+ science and mathematics simulations, including over 30 on chemistry topics, the PhET project aims to ensure that all students have access to these open educational resources. Accessibility features in development include: keyboard navigation, text-to-speech, auditory descriptions, and sonification. These features will allow students to engage with the simulations in multiple modes, with visual, auditory, and textual representations available, along with expanded options for input and output methods (keyboard, screen readers, etc.). These accessibility features create an opportunity to increase the effectiveness of simulations for all students – including those with disabilities. In this work, we share our progress, challenges, and the potential of accessible PhET simulations.
Introduction
The PhET Interactive Simulations project at the University of Colorado Boulder (http://phet.colorado.edu) has engaged in the design and development of free interactive simulations (sims) for learning topics in science and mathematics for over a decade. The result is a suite of over 130 interactive sims – including over 30 sims on chemistry topics – with supporting teacher materials.
Throughout the course of the PhET project, we have prioritized free and easy access for teachers and students, as well as intuitive, exploratory, easy-to-use sim designs. Our efforts to support free and easy access have included significant work to internationalize the sims and to support their operability across diverse platforms and devices. Internationalization efforts include the development of the PhET Translation Utility,1 which allows volunteers around the world to translate the PhET sims (or the entire PhET website) into their local language or dialect and to share their translation with the world through the PhET website. Because of this utility and the efforts of many volunteers, the PhET website is available in 34 languages, and sims are available in 79 languages. Operability efforts include our transition in 2013 from developing sims in Java and Flash to developing sims in HTML5, a web standard that allows PhET sims to run seamlessly in-browser and offline on a wide variety of devices – desktop, laptop, and mobile. You can find our growing collection of HTML5 sims on our website at: https://phet.colorado.edu/en/simulations/category/html, along with a short video introduction to the enhanced capabilities of this collection.
Starting in 2014, PhET has engaged in a new initiative – to increase the accessibility of PhET sims for students with disabilities. Partnering with the Inclusive Design Research Centre at OCAD University in Ontario, Canada, we have begun development of new accessibility features for PhET sims. The new features are intended to support a broad diversity of human needs and preferences, and will focus on adding customizable affordances for all users. Because the goal of these features is to enable the inclusion of students and teachers with and without disabilities, we refer to these new features as inclusive features. Here, we share with you inclusive features in development and describe these features within the context of the States of Matter: Basics sim. Please note that while these features are being actively prototyped and refined, they are not yet publicly available. We anticipate that publication of sims with inclusively designed features will begin during the summer of 2016.
Inclusive Design Features
Inclusive design features for PhET sims will be implemented as new interactive "layers". Students and teachers will have the ability to turn these layers on or off based on their learning needs and preferences. The inclusive design features include:
· Assistive Technology Support: The ability to explore, control, and receive feedback from the sim’s interactive elements for students using an assistive technology or alternative input device. This feature will allow students using devices beyond touch, mouse, and screen visuals (e.g., customized keyboards, switches, and screen magnifiers) to interact with (input) and receive feedback (output) from the sims.
· Keyboard Navigation: The ability to easily and efficiently navigate and interact with the sim and its representations using keyboard input. This feature will benefit younger students who may have difficulty with a mouse interface, students with motor challenges, students with low vision, and those using certain types of assistive devices. Note: The ability to interact with a sim via keyboard is already included under the category of “Assistive Technology Support” above. With the “Keyboard Navigation” feature, we are further focused on careful iterative design of the keyboard navigation ordering. Our goal with keyboard navigation design is to minimize the amount of time and effort required for keyboard users to engage in investigations with the sim (a brief sketch follows this list).
· Text-to-Speech: Reading aloud of on-screen text within the sim. This capability will support students with low vision, students with certain learning disabilities, second language learners, and students with low literacy in interpreting on-screen labels and readouts.
· Auditory Descriptions: Reading aloud of contextual information (e.g., describing the scene and layout), interface controls, object descriptions, and feedback descriptions. This feature will particularly benefit students using screen readers or students with certain cognitive challenges. Two types of auditory descriptions will be provided: (i) comprehensive descriptions designed for students using screen readers who may be unable to see the sim, and (ii) concise supplemental descriptions for younger students, students with low vision, and students with certain cognitive challenges.
· Sonification: Sonifications of interactions (e.g., controls) and science concepts. These sonifications include interpretations of the underlying science models in a form that has been mapped to sound parameters such as pitch, volume, timbre, duration, polyphony, etc. This feature will allow students with low vision to explore science models in new ways, while also supporting any student who benefits from multisensory learning experiences (e.g., students with certain learning disabilities, and students who enjoy engaging with sound).
· Personalization: Menus that allow students to personalize the configuration of their sim experience, including the ability to turn on and off supporting features at the start of sim use, as well as adjust features on-the-fly during sim use.
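To make the Assistive Technology Support and Keyboard Navigation items above more concrete, the sketch below shows one plausible way (not PhET's actual implementation) to expose sim controls as semantic HTML so that keyboards, switches, and screen readers can operate them, with the Tab order grouped pedagogically rather than by screen position. It is written in TypeScript against the standard browser DOM; the groupings and labels are hypothetical and loosely based on the States of Matter: Basics example discussed later.

// A minimal sketch, not PhET's implementation: expose sim controls as semantic
// HTML so keyboards, switches, and screen readers can operate them, with the
// Tab order grouped pedagogically (play area, then control panel, then global
// controls). All element labels and groupings here are hypothetical.
function buildAccessibleLayer(): HTMLElement {
  const root = document.createElement('div');

  // Group 1: play-area interactions (heating/cooling element, pump).
  const playArea = document.createElement('div');
  playArea.setAttribute('role', 'group');
  playArea.setAttribute('aria-label', 'Play Area');

  const heatCool = document.createElement('input');
  heatCool.type = 'range'; // reachable with Tab, adjustable with the arrow keys
  heatCool.setAttribute('aria-label', 'Heat or cool the container');

  const pump = document.createElement('button');
  pump.textContent = 'Pump in particles';
  playArea.append(heatCool, pump);

  // Group 2: control panel (atom and molecule choices).
  const controlPanel = document.createElement('div');
  controlPanel.setAttribute('role', 'radiogroup');
  controlPanel.setAttribute('aria-label', 'Atoms and Molecules');

  // Group 3: global controls (play/pause, Reset All).
  const resetAll = document.createElement('button');
  resetAll.textContent = 'Reset All';

  // Appending the groups in this order sets the default keyboard (Tab) order:
  // play area first, then the control panel, then the global controls.
  root.append(playArea, controlPanel, resetAll);
  return root;
}

document.body.appendChild(buildAccessibleLayer());

Because the elements are native HTML controls with ARIA labels, the same structure serves keyboard users, switch scanning software, and screen readers without separate code paths.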
Since each sim emphasizes a unique set of learning goals, stylistic approaches, and levels of complexity, these inclusive features will be designed and tailored for each sim. By inclusively designing multiple sims simultaneously, we aim to uncover areas of overlap for implementation across sims, as well as areas of uniqueness.
Inclusively Designed Concept Sim: States of Matter: Basics
As an example of inclusively designed features, and how these could benefit students, we share a design concept for an inclusively designed version of States of Matter: Basics (the original version – without inclusive features – is available at http://phet.colorado.edu/en/simulation/states-of-matter-basics). The States of Matter: Basics sim (shown in Figure 1) has the following learning goals: 1) Explore and describe the characteristics of three states of matter: solid, liquid, and gas. 2) Predict how varying the temperature or pressure changes the particles’ motion and the substance’s phase. 3) Recognize that different substances have different properties – including melting, freezing, and boiling temperatures.
Figure 1. Solid, Liquid, Gas (left) and Phase Changes (right) screens of the States of Matter: Basics sim.
The States of Matter: Basics sim consists of two screens, the Solid, Liquid, Gas screen and the Phase Changes screen. In Solid, Liquid, Gas, students can explore the particle motion and phase changes of different atoms and molecules. Students can increase and decrease the temperature inside the container and observe how the particle motion changes dynamically and results in phase changes, or they can jump directly to a phase by selecting the specific phase they would like to view.
For this example, we will focus on the inclusive design features for the Phase Changes screen. The Phase Changes screen is similar to the Solid, Liquid, Gas screen – students find a container with particles inside, a thermometer, the ability to heat and cool the container, and options of different atoms and molecules to explore. In addition, students can “pump” more particles into the container and decrease the volume of the container (by pressing down on the lid), and a pressure gauge indicates the impact of these changes.
Each inclusive feature will be available as a layer that can be turned on or off, with each layer designed to provide pedagogically relevant cueing and feedback – similar to the visual cueing and feedback that is already a hallmark of PhET sims. Because each layer provides pedagogical cueing and feedback, we can ensure that the sims support interactive, exploratory, and pedagogically powerful student and classroom use regardless of the layers selected (whether the student chooses predominantly textual, auditory, or visual layers, or a combination) or the assistive technology device used. Inclusive features will include:
· Assistive Technology Support. We will add support for alternative input devices such as physical and scanning keyboards, and switches, so students can change particle type, heat and cool the system, etc. without using a mouse. Semantic HTML will support the use of screen magnifiers and screen readers, enabling students who are blind or have low vision to engage with the sim’s representations. Support for built-in operating system and web platform features, such as high color contrast mode, will reach students who would otherwise have difficulty distinguishing particle colors or other color cueing.
· Keyboard Navigation. Each interactive element will be navigable via keyboard (or other assistive technology input device). Keyboard navigation will be grouped pedagogically, allowing students to fluidly choose whether they want to interact 1) within the play area (with the container, heating/cooling element, and/or the pump), 2) with the control panel (changing atom or molecule types or showing the phase diagram), or 3) with the global controls (the play/pause and reset all buttons). Sequencing keyboard navigation this way (rather than strictly left-to-right, top-to-bottom) cues students to a pedagogically useful way of exploring the sim (changing temperature, volume, and number of particles before exploring differences across atom and molecule types), consistent with the designed-in goals of the sim as well as the ways in which we have observed students intuitively explore the sim with a mouse or touch interface.
· Text-to-Speech. When a control or readout is selected, its text will be read aloud to students. Students will be able to have headings (e.g., “Atoms & Molecules”), control labels (e.g., “Neon”), and gauge readouts (e.g., the pressure gauge) read aloud to them.
· Auditory Descriptions. For each representation, tool, and control, two layers of auditory description will be added. One layer will be concise text descriptions, for example, “container partially filled with water molecules”, “pump”, “pressure gauge”, “neon”, and “argon”. Another textual layer will be verbose, including more contextual information. For example, “container partially filled with water molecules, water molecules moving quickly in container, bumping into each other and the sides of the container, while also rotating quickly,” “neon, a small atom”, and “argon, a larger atom”. Each descriptive layer will be specifically designed to provide students with pedagogically and contextually relevant information, cueing students to the particular relationships to compare and contrast.
· Sonification. Several sonification techniques will be used to create equivalent alternatives to visual material that would otherwise be unavailable to students with visual impairments, as well as to provide a more immersive experience for all learners. The changing velocity, rotation, and intermolecular interactions of the particles, as well as temperature and pressure, will be mapped to the pitch or timbre of different sounds, providing a sonic representation of the dynamic relationships during phase changes (a rough sketch of such a mapping, together with text-to-speech, follows this list). When the pedagogically relevant cue maps to a sound from a student’s everyday life, Auditory Icons2 will be used (e.g., particles colliding with the container wall will be given a sound quickly recognizable as an object running into a wall). When the sonification is abstract, without a real-world equivalent, Earcons3 will be used, creating synthesized auditory messages that students interpret through exploration with the sim.
· Personalization. The sim will include a personalization panel, where the inclusive layers can be explored, selected, and adjusted. As students explore the visual (e.g., larger text size, text highlighting), auditory (e.g., sonifications), and textual options (e.g., text reading, concise or verbose text support), examples of each will be provided, allowing the student or teacher to quickly recognize whether a particular feature will be beneficial.
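As a concrete illustration of the Text-to-Speech and Sonification layers just described, here is a rough sketch (in TypeScript, using the browser's standard Web Speech and Web Audio APIs) of reading a description aloud and mapping temperature onto pitch. The description string and the temperature-to-frequency mapping are illustrative assumptions, not PhET's published design.

// A rough sketch, not PhET's implementation: read a description aloud with the
// Web Speech API and map temperature onto pitch with the Web Audio API. The
// description text and the frequency mapping below are illustrative only.
function speakDescription(text: string): void {
  // Most modern browsers expose speech synthesis on window.speechSynthesis.
  window.speechSynthesis.speak(new SpeechSynthesisUtterance(text));
}

// Note: browsers typically require a user gesture before audio may start.
const audioContext = new AudioContext();

function playTemperatureTone(temperatureKelvin: number): void {
  // Map 0-400 K onto roughly 200-1000 Hz (an arbitrary, illustrative mapping).
  const frequency = 200 + (Math.min(temperatureKelvin, 400) / 400) * 800;

  const oscillator = audioContext.createOscillator();
  const gain = audioContext.createGain();
  oscillator.frequency.value = frequency;
  gain.gain.value = 0.2; // keep the tone quiet

  oscillator.connect(gain).connect(audioContext.destination);
  oscillator.start();
  oscillator.stop(audioContext.currentTime + 0.15); // a short, Earcon-like blip
}

// Example: announce and sonify the state as a student heats the container.
speakDescription('Container partially filled with water molecules');
playTemperatureTone(373);

A real sim would tie calls like these to the underlying model and let students switch each layer on or off from the personalization panel.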
Progress
· Prototypes. We have developed prototypes for keyboard navigation, auditory descriptions, and early components of sonification. You can find information about these prototypes, including links, at http://phet.colorado.edu/en/about/accessibility.
· User Testing. We have begun gathering a pool of user testers, including users with disabilities, to participate in the iterative process of refining these prototypes and preparing for widespread implementation of inclusive features across the suite of PhET sims.
· Developing Connections. The accessible technology community has a long history of working together to develop guidelines and standards, to research and design accessibility features, and to consult with organizations to make technology more accessible. This is a rich research and development landscape, with novel techniques for accessible graphs and for sonification of informal learning environments, innovative designs for the next human-computer interface (haptic devices, wearable devices, etc.), strategies to increase the affordability and flexibility of adaptive devices (e.g., make-at-home adaptive devices), and wonderful qualitative work into the ways in which those with differing abilities can and do work together to overcome challenges. Members of the PhET project have been attending conferences, workshops, and working groups, and engaging in collaborations to learn how to build on this community’s work, and how to most meaningfully contribute findings back to the community.
Challenges
Our efforts are breaking new ground in accessible interactive learning technologies, and require addressing substantial technical, design, and research challenges.
· Technical Challenges. Interactive sims are not structured like standard web pages. Current standards for creating accessible webpages that communicate well with assistive technologies (keyboards, screen readers, screen magnifiers, etc.) do not uniformly apply to sims. What currently exists are workarounds that are not cross-platform, cross-browser, or mobile-compatible. There are efforts to develop the necessary standards and guidelines in the accessibility community, and PhET’s prototypes (along with the work of many others) are helping to uncover the needs that these standards and guidelines will address.
· Design Challenges. There is a lack of examples of accessible interactive sims that provide exploratory science learning experiences for students and work well across a range of common assistive technologies, platforms, browsers, and devices. As mentioned in the introduction, PhET has always prioritized free and easy access to the sims, and will continue to strive to design sims that have minimal access barriers and maximum click-and-run ease. Adding inclusive features opens up new opportunities for multimodal design and will require high-quality collaborative design work to ensure that future sims retain their current level of ease-of-use.
· Research Challenges. Each student has a unique set of needs and preferences when using technology – e.g., not all screen reader users can be grouped together for analysis. One approach commonly used in accessibility research is the single-case (within-subject) design, but this approach is not commonly used in science education research.
Potential
With the rise and uptake of HTML5 in educational technology, we are entering a time of increased compatibility and interoperability. Now is the time to push the boundaries of accessible educational technologies. We have an unprecedented opportunity to learn from each other, develop standards we can all understand and implement, and create innovative, high-quality learning tools that benefit all students – including students with disabilities.
References
(1) Adams, W. K.; Alhadlaq, H.; Malley, C. V.; Perkins, K. K.; Olson, J.; Alshaya, F.; Alabdulkareem, S.; Wieman, C. E. J. Sci. Educ. Technol. 2010.
(2) Brazil, E.; Fernström, M. The Sonification Handbook 2011.
(3) McGookin, D.; Brewster, S. The Sonification Handbook 2011.
Comments
Inclusive Design Features
Hi Emily,
The work you folks are doing is very interesting, and I can see how coupling many of these non-traditional modes to traditional visual browser-based simulations would be a value for all, disabled or not. I can see how sonification can be clearly linked to dynamic energetic and kinetic based visualizations, and have to say I thought your Faraday's Law one could have been in a Dr. Who movie,
http://www.colorado.edu/physics/phet/dev/html/faradays-law/1.0.1-sonification.9/faradays-law_en.html
I did miss what you meant by "switches" as an Assistive Technology Support on page 2.
Also, many visualizations are static, structure-based, and I am not seeing how many of these technologies would assist in understanding concepts like VSEPR. This leads to two thoughts/questions. First, what efforts are being made to integrate 3D printers, like maybe one that could not only make a model, but include a braille-based narrative as part of the model? Are any of these recyclable? Sort of like Play-Doh that gets reformed for each simulation. Can different atoms have different textures? Have you people been interacting with any of the people in the 3D print world?
Second, is any effort being made to connect actual physical models to a simulation through a QR code placed on the model, that syncs it with the simulation, introducing a tactile element to the lesson? If a QR code can provide orientational information about an object when it is scanned, it would be possible to couple an auditory description of an animation that the person could follow by manipulating the model. Sort of merging the pre-computer molecular model kit with the computer simulation. Is any attempt being made to link physical models to simulations? Is that common?
Thank you for this very informative paper.
Sincerely,
Bob
Hi Bob,
Thanks for your questions, and for checking out some of our prototypes.
- Switch navigation is a way of interacting with your computer (or other tech device) using a switch or button-like interface - turning navigation into a sort of yes/no interaction. There are many different ways of setting this up, but one that I've seen is a switch set up behind a person's head (it was attached to the headrest in their chair). The computer was set to scan through the interactive elements, highlighting each one. When the user wanted to select ("click") an interactive element, he waited until the scanning got to that element and then he activated the switch by tapping it with his head. You can set scanning to go quite fast, and people can get really efficient with this type of interaction. In fact, you can likely turn on scanning/switch navigation for your computer, tablet, or phone right now and try it out. If you have an iPhone, for example, you can turn on switch navigation in the accessibility options menu. The phone will begin scanning through the different apps on your phone (you'll see them highlighted) and tapping anywhere on your phone screen becomes the 'switch', allowing you to interact with your phone apps using only a screen tap. To exit switch mode, just triple-click your "Home" button. (A toy sketch of this scanning loop, in code, appears at the end of this reply.)
- Regarding concepts that are static in nature. First, I would argue that VSEPR doesn't have to be static. Check out PhET's Molecule Shapes simulation to see what I mean. But, I know you were just throwing out an example there. I would definitely agree that some concepts aren't easily or best approached through direct interaction with a dynamic system. I do believe that with 3D printers, different sizes, shapes, and textures are all possible. I haven't explored that world too much, but I agree that overlaps and blending of tactile models and simulations could be very beneficial. One thing I have noticed is that there is a definite need for collaboration across disciplinary experts and accessibility researchers and educators using 3D printed resources. Some important components of visually-represented models can get lost in translation in creation of a 3D printed model, and in some cases, unintended features can be picked up. For example, I have seen 3D printed Bohr model representations where you could twist the outer orbitals to be perpendicular to other orbitals. To me, this could be quite confusing - I interpret the Bohr model as being intrinsically planar, and the 3D printed model allowed a student to create a spherical representation. Similar to simulations which encourage interaction, I think if you hand a tactile model to someone, they are going to touch it and see if anything moves…so movable/twistable parts could get highlighted to the student in ways that are not beneficial for conceptual understanding.
- The scenario you describe in your last question, where a physical model is synced with some sort of auditory or other feedback, is probably being worked on somewhere, just not in a way specific to chemistry. This sounds to me like a cool multimodal manipulative, and would probably be great for structure models in chemistry. It also sounds like a very innovative NSF grant proposal idea in the making to me… :) At PhET, we are planning to explore the role of haptic feedback for PhET simulations. This would allow students (any students!) using a haptic device (these are small and not that expensive) to receive physical feedback as they explore a simulation. So, rather than embedding multimodal output into a physical manipulative, we can embed a physical component into a 2D representation on a screen. Imagine being able to “feel” your way around a simulation, as if it were in 3D!
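In software terms, the scanning interaction described above is roughly the following (a toy TypeScript sketch, purely illustrative and not how any particular switch-access product is implemented): the software highlights each interactive element in turn, and a single input activates whichever element is currently highlighted.

// A toy sketch of switch scanning, purely illustrative: highlight each
// interactive element in turn, and let one input ("the switch") activate
// whichever element is currently highlighted. The 'scan-highlight' CSS
// class is hypothetical.
const elements = Array.from(
  document.querySelectorAll<HTMLElement>('button, input, select')
);
let current = 0;

// Advance the highlight on a fixed interval (the scan rate).
setInterval(() => {
  if (elements.length === 0) { return; }
  elements[current].classList.remove('scan-highlight');
  current = (current + 1) % elements.length;
  elements[current].classList.add('scan-highlight');
}, 1000);

// Here the space bar stands in for the switch: one press activates whatever
// is highlighted, so every control is reachable with a single input.
document.addEventListener('keydown', event => {
  if (event.key === ' ') {
    elements[current].click();
  }
});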
sonification
Hi Emily,
One of the articles in a recent Scientific American was about using sonification as a way to search for connections or trends in the enormous amounts of data being generated from the LHC and cosmological studies. Apparently our brains' ability to handle complex data is actually greater for audio than for visual input, and the ability to interpret and make connections can be increased. Are you considering using any of these aspects?
The link https://phet.colorado.edu/en/simulations/category/html doesn't seem to work – "The requested page could not be found."
Sonification and HTML5 PhET Simulations
Hi!
Indeed, there are many ways sonification is currently being used. There's also a neat TED talk about sonification by Mark Ballora – https://www.youtube.com/watch?v=aQJfQXGbWQ4 – for those interested in learning more. From what I understand from our collaborators at the Sonification Lab at Georgia Tech, there is a tremendous amount of information that can be conveyed through sound without overloading a student, and much work to be done to understand how to best convey information this way. A lot of work has been done with static tone graphs – adding a tone to a graph of a single line. New findings are emerging about the use of sonification for more complex and dynamic graphs, and other forms of information.
Some of the questions we're specifically interested in investigating involve the qualities of sound that can best convey STEM-specific relationships, for example cause-effect relationships. Another interesting question is how to best contextualize sonifications for students. Simple tone graphs provide no contextual information about the meaning of the data being conveyed. If a sound had some qualities that helped convey (for example) "metallic" or "inside a container" to a student, this would minimize the amount of work needed to be done by the tool/instructor/activity to help contextualize the data.
Regarding the broken link – you are quite right, that is no longer the correct link. I apologize for that. We have recently (just last week!) updated the PhET website to feature our suite of HTML5 sims. You can find them here: http://phet.colorado.edu/en/simulations/category/new
Best,
~ Emily