One of the least promoted features in the Mavericks release of OS X, but in my view one of the most important, is the built-in switch access. Why? Well, among the least supported groups of people are those who rely on simple switch devices, operated with limited gestures, in order to interact with technology. Connecting switches is now less of a problem than it once was, with USB a standard. Previously, hardware mods were required, thus invalidating the warranty, if connection was possible at all. Newer switch connectivity solutions for mobile devices, like Tecla, have also arrived. Still, the lack of built-in software control has meant a limited choice of software that directly supports switches (eg a few games) or reliance on specialist third-party switch access software.
As with VoiceOver, the built-in assistive technology that enables blind people to access OS X and iOS devices, switch access in these OSs means consumer kit is now more easily available to those who really need it. Plus, of course, there is the usual so-called ‘curb-cut’ benefit to us all, when we may find it useful in unexpected circumstances (eg triggering behaviour with a wired remote control when other means are not possible).
Who uses switches and how?
People with severely restricted mobility often have limited ability to make fine and/or coarse gestures and so are unable to operate a mouse, keyboard or touch device with any degree of accuracy. This obviously excludes interaction with most tech these days. For these people, a range of mechanical switch devices and associated assistive technology software provide alternative access through very basic gestures. Such gestures include pressing and releasing a big button, sucking or blowing on a straw and, in the case of Stephen Hawking, twitching a cheek muscle to operate a sensitive switch. Another related technology is eye gaze, which tracks a user’s eye movements in order to control a pointer or selection on a device.
How does switch access work?
As with screen readers for the blind, software is needed to convert between expected user interface interactions and what the operator wants to use. Generally, switch access focuses on input control, though, as users may also need support beyond the visual, speech feedback is often provided as well.
As input gestures are greatly limited, the interaction ‘events’ are greatly simplified, often to ‘next item’ and ‘select item’. This leads to so-called ‘scanning’, where the input focus is moved from one item to the next until one is eventually selected by the user. As you can imagine, this makes operation of a user interface or entering text extremely tedious! A number of scanning ‘modes’ aim to alleviate this to some degree and also match user capabilities. Such modes include ‘auto scanning’, where the cursor moves continuously, and ‘group scanning’, where groups of items are selected rather than each item in turn. However, this is an area that cries out for research and innovation!
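To make the auto scanning idea concrete, here is a minimal sketch of a single-switch auto scanner. All names and the structure are illustrative assumptions for this post, not Apple's implementation: a timer repeatedly advances the highlight, and one switch press selects whatever currently has focus.

```python
# Minimal sketch of single-switch auto scanning (illustrative only;
# class and method names are hypothetical, not Apple's implementation).
import itertools


class AutoScanner:
    """Cycles focus through items; a single switch press selects the current one."""

    def __init__(self, items):
        self.items = list(items)
        # Endlessly cycle through item indices; focus starts on the first item.
        self._cycle = itertools.cycle(range(len(self.items)))
        self.focus = next(self._cycle)

    def tick(self):
        """Called on a timer: advance the highlight to the next item."""
        self.focus = next(self._cycle)

    def press(self):
        """Called when the user activates their switch: select the focused item."""
        return self.items[self.focus]


scanner = AutoScanner(["File", "Edit", "View", "Help"])
scanner.tick()               # highlight moves from "File" to "Edit"
selected = scanner.press()
print(selected)              # -> Edit
```

The tedium mentioned above falls straight out of this model: selecting the last of N items costs up to N timer ticks, which is exactly what the grouping modes below try to reduce.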
Traditionally, the user experience has been with a simple grid of items that can be selected. Movement from one item to the next in a grid provides simple linear access suitable for limited gestures. Compare this to the more random access of a pointer or keyboard. Grids are either a complete replacement user interface, as in the excellent Grid 2, or an overlay that lies on top of the main UI, in the same manner as On Screen Keyboards (OSKs) now familiar from touch devices.
Another approach is direct in-application access, something I experimented with a few years back with Mozilla Firefox in a project called Jambu. A further example can be found in Special Access to Windows. In this mode a selection marker (blue rectangle in the photo) is drawn directly around items in the interface, such as buttons or menu items. This indicates they have the input focus and can be selected. The movement is then between items, both across groups and within groups, providing a more immediate, less indirect user experience when compared to a grid.
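The across-groups-then-within-groups movement can be sketched as a two-level scan. The UI structure below is a made-up example for illustration; real in-application scanning would walk the application's actual accessibility hierarchy.

```python
# Sketch of group scanning (hypothetical UI structure, not a real API):
# the first scan pass steps between groups; selecting a group descends
# to scan its members, cutting the number of switch waits needed.
class Group:
    def __init__(self, name, children):
        self.name = name
        self.children = children


ui = [
    Group("Toolbar", ["Back", "Forward", "Reload"]),
    Group("Content", ["Link 1", "Link 2"]),
    Group("Menu",    ["File", "Edit"]),
]


def group_scan(group_steps, item_steps):
    """Two-level scan: wait `group_steps` ticks to pick a group,
    then `item_steps` ticks to pick an item within it."""
    group = ui[group_steps % len(ui)]                      # pass 1: groups
    return group.children[item_steps % len(group.children)]  # pass 2: items

print(group_scan(1, 0))  # -> Link 1
```

With seven items in three groups, reaching the last item costs at most 3 + 3 ticks instead of 7, which is the whole point of grouping.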
You can learn more about switch access in ‘Switch access to technology’, a free PDF by David Colven and Simon Judge, available from the Ace Centre.
What have Apple added to OS X?
After a quick play with the new switch control options (available in System Preferences -> Accessibility) it’s clear that Apple have done an excellent job and cover most requirements. The support allows full control of the UI and text entry, plus switch access to log-in is possible. By the way, switch access landed in iOS 7 a while back and, while I haven’t tried it, it appears similar from the blogs I’ve looked at.
As you can see in the first photograph, I plugged a single button into a USB port using a Joy Cable converter. Out of the box, the default settings worked well, providing usable auto scan once I had identified my switch (by pressing it) and assigned it an action. The space bar is treated as a switch and given the action ‘select item’ by default. Some normal access was required to set up the options, but once configured, operation was perfectly possible with just a switch. Though, admittedly, I performed limited testing.
A wide range of options are available, covering differing numbers of switches and scan modes. A home panel grid overlay provides user access to a range of interaction modes. Overlays are provided for pointer control and an On Screen Keyboard. Further custom overlay designs can be built with an editor. Furthermore, in-application scanning is provided and works well, though I had some issues getting to it from the main panel with my single switch. Other options, such as speaking items with the TTS voice, enhance the experience.
So, all in all, this is another fantastic addition to accessibility from Apple. Once again, Apple have led the way on this. Windows has had basic switch support via games controllers for years, but no built-in accessibility support other than the OSK. Linux (GNOME) has had GOK and Caribou for a while but no in-application access. Let’s hope other desktop, and more importantly, portable, OSs follow Apple’s lead here.
A note for developers
As with VoiceOver, switch access uses Apple’s Accessibility APIs to provide access, and applications must be written to correctly support these APIs and so be accessible. If you are a developer you should use best practices and either stick to built-in controls, or be very clear on how to make custom controls fully accessible. The benefit is more users for your programs and apps. AT developers will find that, unfortunately, the Accessibility APIs are not very open, so creating 3rd party AT support is not as easy as it should be.