There were three recorded accessibility sessions at Google I/O this year, plus three additional "office hours" with Google's accessibility experts for those lucky enough to attend. Read on for a summary of this year's events.
What's New in Android Accessibility
First up was the staple "What's New in Android Accessibility", introduced by the product manager for accessibility, Patrick Clary. He was joined on stage by Maya Ben Ari (project manager on Android accessibility) and Victor Tsaran (technical program manager) as well as Astrid Weber (UX research lead) and Melissa Barnhart (UX researcher on Android).
Patrick did the typical introduction to accessibility in general and specifically on Android.
Victor showcased the new user-facing features from the accessibility team, including:
- accessibility volume - made possible by a new audio stream on Android, separate from the media stream
- fingerprint sensor gestures (used by accessibility services on supported devices as an alternative means of user input)
- multilingual support for the text-to-speech engine
- accessibility shortcut - toggle accessibility service (configurable, defaults to TalkBack) with long-press of both volume keys
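The fingerprint sensor gestures are exposed to accessibility services rather than to ordinary apps. A minimal sketch of how a service might consume them is below - the class name is hypothetical, and it assumes API 26 (Android O) plus the fingerprint-gesture capability declared in the service's XML configuration:

```java
import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.FingerprintGestureController;
import android.view.accessibility.AccessibilityEvent;

// Hypothetical service name; requires CAPABILITY_CAN_REQUEST_FINGERPRINT_GESTURES
// in the service's accessibility-service XML (API 26+).
public class FingerprintGestureService extends AccessibilityService {

    @Override
    protected void onServiceConnected() {
        FingerprintGestureController controller = getFingerprintGestureController();
        if (!controller.isGestureDetectionAvailable()) {
            return; // no supported sensor, or gesture detection is unavailable
        }
        controller.registerFingerprintGestureCallback(
                new FingerprintGestureController.FingerprintGestureCallback() {
                    @Override
                    public void onGestureDetected(int gesture) {
                        switch (gesture) {
                            case FingerprintGestureController.FINGERPRINT_GESTURE_SWIPE_UP:
                                // e.g. scroll the focused content up
                                break;
                            case FingerprintGestureController.FINGERPRINT_GESTURE_SWIPE_DOWN:
                                // e.g. scroll the focused content down
                                break;
                        }
                    }
                }, null /* null Handler: callback runs on the main thread */);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) { }

    @Override
    public void onInterrupt() { }
}
```

Note that gesture detection is only available on devices with a supported sensor, so checking isGestureDetectionAvailable() first is worthwhile.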
Maya talked about new APIs:
- continuous gesture API - it wasn't clear to me how this would be used; it looked like APIs supporting gestures beyond clicks, such as drag, swipe and pinch-to-zoom
- accessibility button - this is different from the accessibility shortcut, allowing users access to a feature within an enabled accessibility service, rather than just toggling the service on and off
- Select to Speak service - like TalkBack but the user must choose explicitly which part of the screen to have read aloud. Suitable for low-vision users.
- Testing accessibility - includes Accessibility Scanner and Espresso/Robolectric automatic tests
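To make the first two APIs above more concrete, here is a sketch of a service that registers for the accessibility button and dispatches a drag as a continued gesture. The class name, coordinates and timings are all hypothetical; it assumes API 26 and that the service requests the accessibility button flag in its configuration:

```java
import android.accessibilityservice.AccessibilityButtonController;
import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.GestureDescription;
import android.graphics.Path;
import android.view.accessibility.AccessibilityEvent;

// Hypothetical service name; assumes FLAG_REQUEST_ACCESSIBILITY_BUTTON is set
// in the service's accessibility-service configuration (API 26+).
public class ButtonAwareService extends AccessibilityService {

    @Override
    protected void onServiceConnected() {
        getAccessibilityButtonController().registerAccessibilityButtonCallback(
                new AccessibilityButtonController.AccessibilityButtonCallback() {
                    @Override
                    public void onClicked(AccessibilityButtonController controller) {
                        // Trigger a feature of this service - not just toggle it on/off
                    }

                    @Override
                    public void onAvailabilityChanged(AccessibilityButtonController controller,
                                                      boolean available) {
                        // The button isn't available on every device or navigation mode
                    }
                });
    }

    // Continued-gesture sketch: a drag dispatched as a stroke marked "will continue",
    // then extended with a second leg via continueStroke().
    private void dragInTwoSteps() {
        Path firstLeg = new Path();
        firstLeg.moveTo(100, 100);
        firstLeg.lineTo(100, 400);
        GestureDescription.StrokeDescription stroke =
                new GestureDescription.StrokeDescription(firstLeg, 0, 250, true /* willContinue */);
        dispatchGesture(new GestureDescription.Builder().addStroke(stroke).build(), null, null);

        Path secondLeg = new Path();
        secondLeg.moveTo(100, 400);
        secondLeg.lineTo(400, 400);
        GestureDescription.StrokeDescription continued =
                stroke.continueStroke(secondLeg, 0, 250, false /* end of gesture */);
        dispatchGesture(new GestureDescription.Builder().addStroke(continued).build(), null, null);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) { }

    @Override
    public void onInterrupt() { }
}
```

The willContinue flag is what distinguishes this from the pre-O gesture dispatch: the pointer stays down between the two dispatches, which is how a drag (rather than two separate taps) is produced.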
Finally Astrid and Melissa spoke about user testing (UX research methods) for accessibility:
- definition of UX research - change your perspective to that of your users
- research methods - usability studies and intercepts (I think this is a euphemism for snaring unsuspecting participants for a quick Q&A)
- outcomes - some conclusions based on the research
Even though this was a high-level session, I was disappointed and frustrated, but not altogether surprised, at the lack of technical examples. When I downloaded the developer preview of O to try the new accessibility APIs, I opened a few tickets (unsure whether they were bugs or whether I was using the APIs incorrectly) but they were all closed.
I can't see the point of adding new APIs if there's no easy-to-follow documentation: there are people who are interested in developing for accessibility, and there are people who can work with the limited documentation available, but the overlap between these two groups is very small.
- the fingerprint sensor gestures look cool - it would be great if someone developed a service that could emulate the old trackballs
- the biggest user benefit will be the accessibility volume (separate from media volume) and multilingual support which can switch between languages when reading aloud
- with regards to testing accessibility, focus on manual testing and (once you're confident) try automating those tests. Don't rely on the "automatic" tests from Accessibility Scanner or the additions to Espresso/Robolectric, but use them as a starting point to highlight issues.
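As a starting point for the Espresso side of that, the checks can be switched on once for a whole test class. A minimal sketch, using the 2017-era android.support.test package names from the espresso-accessibility artifact (the test class name and the view ID in the comment are hypothetical):

```java
import android.support.test.espresso.accessibility.AccessibilityChecks;
import org.junit.BeforeClass;
import org.junit.Test;

public class AccessibilityChecksTest {

    @BeforeClass
    public static void enableAccessibilityChecks() {
        // Run the accessibility checks on every Espresso ViewAction, evaluating
        // the whole hierarchy from the root view rather than only the acted-on view.
        AccessibilityChecks.enable().setRunChecksFromRootView(true);
    }

    @Test
    public void screenPassesAccessibilityChecks() {
        // Any ordinary Espresso interaction now fails the test when a check fails, e.g.:
        // onView(withId(R.id.submit)).perform(click());
    }
}
```

This fits the advice above: the checks only fire on views your tests actually interact with, so treat a green run as a starting point, not proof of accessibility.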
Accessibility UX Insights: Designing for the Next Billion Users
Astrid, from the initial session, and Nithya Sambasivan (UX researcher) presented the NBU (Next Billion Users) Accessibility Framework. This was pretty interesting - it felt like a TED talk.
If you're just after the actions though, you can skip forward to the summary.
Pragmatic Accessibility: A How-To Guide for Teams
This was my favourite session, with Rob Dodson presenting Accessibility for Teams. It's one of the best talks I've seen on accessibility (and in general).
One of the points Rob made is something I cannot emphasise enough: just because something is accessible doesn't mean it's usable.
He suggested different responsibilities based on your role in the team:
- accessibility is a team effort - everyone has a role
- create checklists - useful for new projects and legacy projects
- find ways to automate your tests (blog post coming soon for Android from Novoda)
- watch the whole video!
And that's all the accessibility-related news from Google I/O 2017 - let me know what you think on Twitter.