By Taliesin L. Smith & Emily B. Moore
The PhET Interactive Simulations project has created over 150 interactive simulations (sims) for learning topics in science, technology, engineering, and mathematics (STEM). These sims are available online for free from the PhET website. This article is the second of a two-part series on accessibility of the PhET simulations. In the first article of this series (Moore, 2016), we introduced the PhET sims and how they can support student engagement. In this second article, we highlight sim features that can support use by students with diverse needs, and introduce new features designed to support students with disabilities. We also discuss topics to consider when using accessible simulations in lesson plans.
Accessibility of PhET Simulations
A primary goal of the PhET project is to create sims that are as accessible as possible – to everyone. Our prior efforts to maximize the accessibility of PhET sims have focused on ensuring wide availability and ease of access. To support access for teachers and students around the world, we share the PhET sims for free through the PhET website, and also license each sim as an Open Educational Resource (Hylén, 2006; Smith, 2009). This means that all schools, teachers, and students can use PhET sims as little or as much as they want, at no cost. The sims can be run online or downloaded for offline use – running online is useful for those who cannot download to the device they are using, while offline use supports groups and communities without consistent internet access. Together, these features help ensure that low-income and remote communities have access to the sims.
To support access, sims can be run on a broad range of devices – for example, laptops, tablets, and mobile phones. The result is tremendous flexibility for teachers and students. A student who may not have a computer at home might use a mobile phone to complete a homework assignment that includes a sim. A teacher in a classroom that has tablets for each student can do group or individual activities using the sims.
We also provide the sims in 86 different languages. To do this, we created a translation tool (Adams et al., 2012) that allows volunteers to translate the text within the sims into their native language. Use of the translation tool and the work of many dedicated volunteer translators have resulted in worldwide use of PhET sims, including use by many people who do not speak or read English.
The wide availability and ease of use of the PhET sims have made them highly used, and have also made them flexible and adaptable within many learning environments. Over the last decade of the PhET project, the technology used to create the sims has evolved, and the project has begun utilizing new technologies (Hickson et al., 2014; Craig & Cooper, 2014; King et al., 2016) to increase access. It is now possible to take the accessibility of the sims to a new level, so that more students – including students with disabilities who require alternate access – can experience and benefit from the sims (Moore & Lewis, 2015).
Broadening Accessibility of Simulations
In 2013, the PhET project transitioned from developing sims in Java and Flash (two technologies with limited accessibility infrastructure and device compatibility) to developing sims in HTML5 (the new standard for digital educational resources). With the transition to an HTML5 infrastructure, we aimed for each new sim to be born accessible – meaning that they are designed and developed with accessibility features from the start. Addressing accessibility from the beginning is often much more successful and functional than trying to add accessibility features to an inaccessible sim or application.
In order to achieve the goal of born-accessible PhET sims, we had to address a number of technical challenges. Typical webpages created in HTML5 support assistive technologies (such as screen reader software) by providing accessibility information to users (e.g., identifying headings, links, and buttons, and reading the alternative text of images) (Garaventa, 2013; WebAIM, 2014; Watson, 2015). International collaborations have resulted in standards and resources that support web developers in making HTML5 webpages as accessible as possible to assistive technology. However, highly visual and interactive resources like the PhET sims are more complex in structure than typical webpages, and there is currently no standard way to support communication between a sim and assistive technologies. To address this challenge, we built a new component into the accessible HTML5 PhET sims that communicates with assistive technologies dynamically as students interact with the sim. This new component runs parallel to the visual aspects of the sim, acting as an invisible webpage-like structure that provides updated accessibility information. This information reflects real-time changes to the sim and is communicated to the student through their assistive technology, so that they can experience what is happening in the sim as it is happening.
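The parallel structure described above can be illustrated with a minimal sketch. Assuming a simple state object, the function below renders a hidden, webpage-like view that a screen reader can navigate; the function and property names here are hypothetical and do not reflect PhET's actual implementation.

```javascript
// Hypothetical sketch: render a hidden, webpage-like "parallel" view of a
// sim's state for screen readers. Names are illustrative, not PhET's API.
function renderAccessibleView(state) {
  return [
    // In a browser, CSS would hide this container visually while leaving
    // it available to assistive technology.
    '<div class="accessible-view">',
    '  <h1>' + state.title + '</h1>',
    '  <p>' + state.sceneSummary + '</p>',
    // A live region announces dynamic changes as the student interacts.
    '  <p aria-live="polite">' + state.latestUpdate + '</p>',
    '</div>'
  ].join('\n');
}

const markup = renderAccessibleView({
  title: 'John Travoltage',
  sceneSummary: 'John stands on a carpet, next to a door with a doorknob.',
  latestUpdate: 'Negative charges jump from the carpet onto John.'
});
console.log(markup);
```

Re-rendering this structure whenever the sim's state changes is what allows a screen reader to report real-time events without the student leaving the sim.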
With this new technological infrastructure in place, we are able to add features to the sims that increase their accessibility for students with disabilities. Prior to this work, students with visual disabilities or certain mobility impairments were unable to independently use the sims. By adding features, including keyboard navigation, auditory descriptions, and sonification, we can increase the accessibility of the sims and improve the learning experience for all students. These three features will be discussed in the examples below.
As we work towards incorporating accessibility features into the PhET sims, we have made prototype sims available with keyboard navigation, auditory description, and sonification. You can try out these accessible prototypes on the Accessibility page of the PhET website. Specifically, accessible versions of both the John Travoltage and Balloons and Static Electricity sims can now be fully navigated using the keyboard, and have auditory descriptions that make them accessible with screen reader software. In addition, the accessible John Travoltage prototype includes sonification features. We are in the process of refining the design of the accessibility features within these prototypes before releasing completed versions. The design of these final versions will be informed by research with, and feedback from, users with disabilities who use assistive technology (Hung, 2016; Smith, Lewis & Moore, 2016). We anticipate publishing multiple completed PhET sims with this collection of accessibility features in the spring of 2017.
Keyboard navigation provides access for students who cannot use a mouse, touch screen, or trackpad. For example, keyboard navigation can be utilized by students with mobility impairments who use keyboards (including on-screen keyboards or alternative keyboards) or other assistive technologies that can operate as a switch (e.g., customized buttons, joystick, sip-and-puff device) (Sloan & Horton, 2014). Students with vision loss or blindness also rely heavily on keyboard navigation.
Keyboard Navigation in John Travoltage
Introduced in Part I of this series (Moore, 2016) and shown in Figure 1, the John Travoltage sim can be used to explore topics related to static electricity. When the sim opens, John is standing on a carpet next to a door. John’s leg and arm can be moved. Moving John’s leg collects negative charges from the carpet, and moving John’s arm allows these charges to be discharged when his hand gets close enough to the doorknob.
Figure 1. Screenshot of the John Travoltage sim: As John’s leg moves back and forth, his foot rubs on the carpet and negative charges build up in his body. The pink circle around John’s leg shows that the user is currently interacting with John’s leg using keyboard keys.
Rather than moving John’s leg or arm by dragging with a mouse (or with a finger on a touch screen or trackpad), keyboard navigation allows students to move John’s leg or arm using the keyboard keys or other assistive technology (e.g., switches). With keyboard keys, students can navigate to John’s leg and then use the left and right (or up and down) arrow keys to change the position of John’s leg. As students interact with John’s leg, or any other interactive object in the sim, the object is highlighted with a brightly colored outline. These colored highlights show the keyboard focus and support students in knowing which on-screen objects are interactive, and which object they are currently interacting with (Sloan & Horton, 2014).
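As a sketch of how arrow-key input might update an object's position, the function below nudges a leg angle within a fixed range. The step size, angle limits, and names are assumptions for illustration, not PhET's actual code.

```javascript
// Hypothetical sketch of arrow-key handling for a draggable sim object
// (John's leg). Step size and angle limits are illustrative assumptions.
function nextLegAngle(angle, key, step = 5, min = -90, max = 90) {
  if (key === 'ArrowLeft' || key === 'ArrowDown') angle -= step;
  if (key === 'ArrowRight' || key === 'ArrowUp') angle += step;
  // Clamp to the leg's range of motion so keys cannot move it off-screen.
  return Math.min(max, Math.max(min, angle));
}

// In a browser, this would be wired to a focusable element, e.g.:
// legNode.addEventListener('keydown', e => {
//   legAngle = nextLegAngle(legAngle, e.key);
// });

console.log(nextLegAngle(0, 'ArrowRight'));  // 5
console.log(nextLegAngle(88, 'ArrowRight')); // 90 (clamped at the limit)
```

Keeping the key handling in a small pure function like this makes it easy to support several input devices (arrow keys, switches mapped to key events) with one piece of logic.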
Auditory descriptions are text-based descriptions of images, videos, and in our case, sims. They are embedded into our accessible infrastructure so that students who cannot see the sim screen can listen to these descriptions when using a screen reader. In addition to the text, the screen reader also conveys other structural information: for example, whether the text is a heading or an item in a list or an interactive object (like a button). This combination of description and structural information provides non-visual access (Garaventa, 2012 & 2013; WebAIM, 2014; Watson, 2014 & 2015) to science topics that are typically considered highly visual in nature, and provides new opportunities for students with vision loss or blindness to engage in science learning.
Auditory Description in Balloons and Static Electricity
Introduced in Part I of this series (Moore, 2016) and shown in Figure 2, the Balloons and Static Electricity sim is used to explore more advanced topics in static electricity, such as transfer of charge, attraction, and repulsion. When the sim opens, three large, brightly colored interactive objects – a yellow balloon, a sweater, and a wall – appear onscreen. Each of these objects has positive and negative charges, shown in pairs: negative charges are represented by blue circles with minus signs, and positive charges by red circles with plus signs. Less prominent interactive items – buttons showing one balloon or two balloons, and a button with the text “Remove Wall” – give an implicit hint that other layouts are possible in the sim.
When a student using a screen reader first opens the sim, the title of the sim is read aloud to the student. Then, the screen reading software continues to read the (visually hidden) text-based descriptions on the page from top to bottom, unless the student presses keyboard keys to stop the screen reader or to listen in a different way. Students can use keyboard keys to navigate the sim, and depending on the keys selected, can listen to details line-by-line or skim the descriptions. For students listening line-by-line, a Scene Summary located at the beginning of the sim’s auditory descriptions provides an overview, or big-picture view, of the entire state of the sim. The Scene Summary ends with a hint to play with the balloon. Students who want to skim the content of the auditory descriptions can skip paragraphs and listen only to headings or interactive objects. By carefully wording and ordering the headings, we can design an outline for the sim that is story-like and action-oriented. Auditory descriptions encourage students to interact regardless of their path of exploration, much like the bright visual objects encourage students who can see to get started with the sim.
When interacting with an object such as the balloon, students hear different descriptions depending on the stage of interaction and what is happening while interacting. When they first grab the balloon, they hear a full detailed description of the balloon (name, position, and charge), and hear instructions on how to move it. When they drag the balloon, they hear very short descriptions, indicating only the direction and destination, as shown in Figure 2 (e.g., “Left. Closer to sweater.” or “Near sweater.”), but there is more detail once “On sweater,” because a new event (i.e., the transfer of charge) takes place. Context and history are accounted for in the descriptions provided to students.
Figure 2. Screenshot of Balloons and Static Electricity: Auditory descriptions (e.g. “Left. Closer to sweater,” “Near sweater,” “On sweater. Yellow balloon picks up negative charges from sweater.”) are read out with screen reading software as a user moves the yellow balloon towards the sweater with keyboard keys.
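Context-dependent descriptions like those in Figure 2 could be generated along these lines. The thresholds, distance units, and exact strings below are illustrative assumptions, not PhET's actual design.

```javascript
// Hypothetical sketch: choose a drag description for the balloon based on
// its distance to the sweater. Thresholds and strings are illustrative.
function dragDescription(distanceToSweater, previousDistance) {
  const movingCloser = distanceToSweater < previousDistance;
  if (distanceToSweater === 0) {
    // A new event (charge transfer) warrants a longer description.
    return 'On sweater. Yellow balloon picks up negative charges from sweater.';
  }
  if (distanceToSweater < 20) return 'Near sweater.';
  return movingCloser ? 'Left. Closer to sweater.' : 'Right. Further from sweater.';
}

console.log(dragDescription(100, 120)); // "Left. Closer to sweater."
console.log(dragDescription(10, 20));   // "Near sweater."
console.log(dragDescription(0, 10));    // the longer charge-transfer description
```

Note how the function compares the current position against the previous one: this is one simple way descriptions can account for context and history rather than repeating a full state readout on every move.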
PhET sims are designed to support learning through exploration. Students can use each sim in a variety of ways, and progress through the sim taking many different (though equally beneficial) pathways. Through well-designed auditory descriptions (i.e., descriptions that are not overly interpretive or directive), students accessing the sim through this feature can also be supported in learning through exploration and follow their own unique pathway through the sim. Auditory descriptions must make sense when accessed in any order, and need to immediately connect students’ actions with auditory feedback. Our approach – a skimmable outline, a big-picture view, and short descriptions during interaction – allows free exploration and supports diverse navigation styles for students with vision loss or blindness.
Sonification is the use of sound, such as musical tones, to represent data (Brown et al., 2003; Hermann, Hunt & Neuhoff, 2011). In PhET sims, a single action often results in multiple changes within the sim. For example, in the Balloons and Static Electricity sim, rubbing the balloon on the sweater results in a change of the net charge on the two objects and a change in the position of the balloon. Sonification can be an effective way to provide non-visual feedback of multiple changes that happen at once.
Figure 3. Screenshot of John Travoltage: John gets a shock as negative charges rapidly discharge through his hand to the doorknob. Three different sounds are used to communicate changes as the user interacts (e.g., a sound representing hand position increases in pitch as the user moves John’s hand closer to the doorknob; when close enough, charges line up rapidly and discharge through his hand to the doorknob).
Sonification in John Travoltage
In the John Travoltage sim introduced earlier and shown in Figure 3, relative phrases (such as “far,” “close,” and “very close”) can be used to describe the distance between John’s hand and the doorknob, but with many possible hand locations these phrases do not fully capture the range of motion available. When sound is mapped to all possible hand locations, more detailed feedback on the distance between John’s hand and the doorknob can be communicated to the student. As a student moves John’s arm, they hear a tone indicating the location of the hand, and the tone rises in pitch as the hand approaches the doorknob (Figure 3). Also in the John Travoltage sim, the representation of the amount of charge acquired in John’s body as his foot rubs on the carpet is designed to attract the eye – using bright blue moving circles. Depending on the scenario students have arranged in the sim, the charges may move around slowly and randomly, or the charges may rapidly line up and move through John’s arm to be discharged to the doorknob. The amount of charge can be expressed in words using relative or exact terms (Hung, 2016), but the motion and organization of the charges is more difficult to describe in words. Sound can convey the relationship between the number of charges, their motion, and the hand’s proximity to the doorknob in a way that represents key aspects of what is happening as vividly as the visual representation does.
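A simple version of this distance-to-pitch mapping is sketched below. The frequency range and linear mapping are assumptions for illustration, not PhET's actual sonification design.

```javascript
// Hypothetical sketch: map hand-to-doorknob distance to a tone frequency,
// so a closer hand produces a higher pitch. The range and linear mapping
// are illustrative assumptions.
function distanceToFrequency(distance, maxDistance, minHz = 220, maxHz = 880) {
  // Normalize distance to [0, 1], then invert so 0 distance -> highest pitch.
  const t = 1 - Math.min(distance, maxDistance) / maxDistance;
  return minHz + t * (maxHz - minHz);
}

// In a browser, this could drive a Web Audio oscillator as the arm moves:
// const ctx = new AudioContext();
// const osc = ctx.createOscillator();
// osc.connect(ctx.destination);
// osc.start();
// osc.frequency.value = distanceToFrequency(currentDistance, maxDistance);

console.log(distanceToFrequency(300, 300)); // 220 (hand far: lowest pitch)
console.log(distanceToFrequency(0, 300));   // 880 (hand at doorknob: highest pitch)
```

Because every hand position maps to a distinct frequency, the tone carries finer-grained information than a small set of relative phrases could.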
Accessibility Benefits Everyone!
The goal of providing multiple accessibility features (that can be used together or separately) is to make sims accessible for students with disabilities. However, as with universal design and universal design for learning1, including these features is likely to be beneficial to all students. For example, auditory descriptions may help students who are learning in a language other than their native language. Sonification can also provide a more immersive experience for all students and be particularly beneficial for students who have trouble interpreting the visual changes alone. Finally, our implementation of keyboard navigation allows students to alternate between mouse and keyboard interaction, supporting collaborative learning amongst students with diverse needs.
Tips for Teaching with Accessible PhET Simulations
For teachers planning to utilize these types of accessibility features within interactive learning resources, here are a few tips to consider:
- Familiarize yourself with the accessibility features available. The PhET sims with accessibility features will be accompanied by teacher professional development materials to support teachers in learning the capabilities of the accessibility features, along with ideas for classroom use.
- Ensure that students are able to use the accessibility features. Particularly for students utilizing accessibility features out of necessity, check in with students before using the sim in an activity to ensure that a student’s assistive devices are compatible with the sim. Consider who else needs to be involved from the educational team (e.g., parents, assistive technologists, learning resource or special education resource teachers, itinerant vision resource teachers, etc.).
- Consider how the accessibility features might support students with and without disabilities. For example, students without disabilities may benefit from auditory descriptions to support learning vocabulary (including general vocabulary and science-specific vocabulary), and sonification may provide non-visual reinforcement of visual trends and relationships. Sonification may also be particularly beneficial to complement lecture materials when the sims are used as lecture demonstration tools. In some situations, keyboard navigation may provide more efficient access to interact with the sim than a mouse or trackpad, and it also provides a foundation in keyboard shortcuts, which are useful to students of all abilities. When orienting the class to sim features, consider presenting the accessibility features along with other features of the sim, without specifically calling them out as for the purpose of “accessibility.” For example, when students first start using a sim, you can let them know that “you can interact with the objects using your mouse, trackpad, or keyboard shortcuts.” Many “accessibility” features are becoming integrated into everyday technology, and normalizing different tools and technologies is important in promoting inclusion and reducing stigma.
- Consider how the accessibility features may present new opportunities for discussion of content. Students with disabilities and students without disabilities can all use the sims together. Having students access the sim through different, though shared, means can result in new ways of discussing – and learning – the science content (and the concept of accessibility!).
- Provide feedback or encourage your students to provide feedback if you run into any accessibility issues. The PhET project values feedback from students, families, and teachers as we continually improve the sims.
Students of all abilities should have the opportunity to study what they find interesting. The general availability and ease of use of PhET’s science and mathematics sims has been beneficial for many students and teachers around the world. With the addition of new accessibility features such as keyboard navigation, auditory descriptions, and sonification, accessible PhET sims make it possible for students with vision loss or blindness to learn with the sims. The keyboard navigation that has been introduced to the accessible sims also broadens access for students with some mobility impairments. Finally, just like universal design, adding multiple accessibility features – keyboard navigation, auditory descriptions, and sonification – expands learning options for all students.
To try out PhET’s accessible prototypes or find out more about PhET’s research on accessibility, please visit the Accessibility section of the PhET website (https://phet.colorado.edu/en/accessibility).
1. To learn more about universal design and universal design for learning, you may be interested in reading the SNOW Feature “Universal Design, Technology, and Education” (Borges, 2011) and visiting the webpage of the National Centre on Universal Design for Learning from CAST, the Center for Applied Special Technology (National Center on Universal Design for Learning, 2014).
Taliesin L. Smith is a front-end web developer by trade and has been assisting with the design of online courses at Memorial University since 2009. Her passion for web accessibility, in general, and accessibility in online education, in particular, led her to complete a Masters of Design in Inclusive Design at OCAD University. Her Masters research project focused on the research and design of accessibility features to provide non-visual access for the PhET Simulation, Balloons and Static Electricity.
Emily B. Moore is Director of Research and Accessibility for the PhET Interactive Simulations project at the University of Colorado Boulder. Dr. Moore conducts research across middle school and undergraduate levels on topics including simulation design, student use of simulations, and teacher facilitation strategies with simulations. She also leads research and development efforts to increase the accessibility of PhET simulations, which includes recent work in the design of auditory description, keyboard navigation, and sonification to support non-visual access to simulations.
Adams, W. K., Alhadlaq, H., Malley, C. V., Perkins, K. K., Olson, J., Alshaya, F., … Wieman, C. E. (2012). Making on-line science course materials easily translatable and accessible worldwide: Challenges and solutions. Journal of Science Education and Technology, 21(1), 1–10. https://doi.org/10.1007/s10956-010-9275-y
Brown, L. M., Brewster, S. A., Ramloll, S. A., Burton, R., & Riedel, B. (2003). Design guidelines for audio presentation of graphs and tables. Proceedings of the 9th International Conference on Auditory Display (ICAD) (pp. 284–287). Boston, MA, USA. Retrieved from http://eprints.gla.ac.uk/3196/
Borges, C. D. (2011). Universal Design, Technology, and Education. Retrieved from http://www.snow.idrc.ocad.ca/node/226
Craig, J., & Cooper, M. (2014, March 20). Accessible Rich Internet Applications (WAI-ARIA) 1.0 [W3C Web Standard]. Retrieved February 12, 2016, from https://www.w3.org/TR/wai-aria/
Garaventa, B. (2012, July 15). Why keyboard accessibility isn’t the same thing as screen reader accessibility [Professional Social Networking]. Retrieved from https://www.linkedin.com/groups/4512178/4512178-134539009
Garaventa, B. (2013, January 2). How browsers interact with screen readers and where ARIA fits in the mix – SSB BART Group [Company Blog]. Retrieved January 11, 2016, from http://www.ssbbartgroup.com/blog/how-browsers-interact-with-screen-readers-and-where-aria-fits-in-the-mix/
Hermann, T., Hunt, A., & Neuhoff, J. G. (Eds.). (2011). The sonification handbook. Berlin: Logos Verlag.
Hickson, I., Berjon, R., Faulkner, S., Leithead, T., Doyle Navara, E., O’Connor, E., & Pfeiffer, S. (2014, October 28). HTML5 [W3C Web Standard]. Retrieved April 10, 2016, from https://www.w3.org/TR/html5/
Hung, J. (2016, September 14). PhET John Travoltage simulation design [Project Wiki]. The Fluid Project. Retrieved September 15, 2016 from https://wiki.fluidproject.org/display/fluid/PhET+John+Travoltage+Simulation+Design
Hylén, J. (2006). Open educational resources: Opportunities and challenges. Proceedings of Open Education, 49-63.
King, M., Nurthen, J., Cooper, M., Bijl, M., Scheuhammer, J., Pappas, L., & Schwerdtfeger, R. (2016, March 17). WAI-ARIA authoring practices 1.1 [W3C Web Standard (Working Draft)]. Retrieved October 3, 2016, from https://www.w3.org/TR/wai-aria-practices/
Moore, E. M. (2016). The PhET Interactive Simulations Project: Working to increase access to interactive STEM simulations [Part 1 of 2]. SNOW, Education, Access and You! Retrieved October 3, 2016, from http://www.snow.idrc.ocad.ca/node/263
Moore, E. B., & Lewis, C. (2015). Opportunity: Inclusive design for interactive simulations. In ASSETS ’15 Proceedings of the 17th international ACM SIGACCESS conference on Computers and accessibility (pp. 395–396). Lisbon, Portugal: ACM Press. https://doi.org/10.1145/2700648.2811387
National Center on Universal Design for Learning. (2014, July 31). What is UDL? Retrieved from http://www.udlcenter.org/aboutudl/whatisudl
Sloan, D., & Horton, S. (2014, October 8). Why keyboard usability is more important than you think. Retrieved from http://www.usertesting.com/blog/2014/10/08/why-keyboard-usability-is-more-important-than-you-think/
Sloan, D., & Horton, S. (2014, October 8). Designing better keyboard experiences. Retrieved from http://www.usertesting.com/blog/2014/10/08/designing-better-keyboard-experiences/
Smith, M. S. (2009). Opening education. Science, 323(5910), 89–93. https://doi.org/10.1126/science.1168018
Smith, T. L., Lewis, C., & Moore, E. B. (2016). A balloon, a sweater, and a wall: Developing design strategies for accessible user experiences with a science simulation. In M. Antona & C. Stephanidis (Eds.), Universal Access in Human-Computer Interaction. Users and Context Diversity (Vol. 9739, pp. 147–158). Cham: Springer International Publishing. Retrieved from http://link.springer.com/10.1007/978-3-319-40238-3_15
WebAIM. (2014, November 19). Designing for screen reader compatibility [Blog]. Retrieved from http://webaim.org/techniques/screenreader/
Watson, L. (2014, September 21). Understanding screen reader modes [Blog]. Retrieved from http://tink.uk/understanding-screen-reader-interaction-modes/
Watson, L. (2015, March 16). Accessibility APIs: A key to web accessibility. Smashing Magazine. Retrieved from http://www.smashingmagazine.com/2015/03/16/web-accessibility-with-accessibility-api/
Published May 2018