Ndigi Gichingiri Presents

Sounds Good

Improving the screen reader experience for TalkBack and VoiceOver users.

Design Ops

Background

During an accessibility audit, Jocelyn and I found various issues that made My PetSafe ADA non-compliant and degraded the experience of our app for those using screen readers. Seeing as RSC had already been sued over this issue before, we thought our customers and our business would benefit from taking action to remedy these issues.

My Role

I performed our accessibility audit on Android, documented the issues, presented the findings to our developers alongside Jocelyn, created our annotation template, and facilitated implementation.

The Problem

VoiceOver and TalkBack users were hindered by components missing labels, a lack of appropriate grouping, and content carrying over from previous screens.

The Solution

Relevant labeling of components, including labels at the parent level; eliminating content carryover; and ensuring accessibility is considered at every point of our process.
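To make the parent-level labeling concrete, here is a minimal Jetpack Compose sketch using a hypothetical device status card rather than our actual components. Clearing and setting the semantics at the container level lets TalkBack announce the card as a single, sensibly labeled element instead of stopping on each child view.

```kotlin
import androidx.compose.foundation.layout.Column
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.semantics.clearAndSetSemantics
import androidx.compose.ui.semantics.contentDescription

// Hypothetical device status card; not the actual My PetSafe code.
@Composable
fun DeviceStatusCard(deviceName: String, batteryPercent: Int) {
    Column(
        // Label at the parent level: TalkBack announces the card as one phrase
        // instead of forcing users to swipe through each child Text separately.
        modifier = Modifier.clearAndSetSemantics {
            contentDescription = "$deviceName, battery at $batteryPercent percent"
        }
    ) {
        Text(text = deviceName)
        Text(text = "$batteryPercent%")
    }
}
```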


The Process

Since Jocelyn and I had already identified the issues, our next course of action was to determine which interactions were appropriate. We consulted Apple's Human Interface Guidelines and Google's Material Design Guidelines for recommendations and tested a variety of native apps to see how Apple and Google implemented their own recommendations.

Reporting the Results

At this point in the process, members of our QA team expressed interest in hearing the observations from our audit. Shortly after, we presented these findings to our front-end team.

We sorted our results into five sections in need of improvement.

Defining Next Steps

Thankfully, we found that our developers were very receptive and eager to help us tackle the issues we observed, but as we started discussing next steps it quickly became clear that simply stating the issues and solutions wouldn't be sufficient. Jocelyn and I would need to define our expectations.

Shaping the Requirements

With documentation identified as a deliverable, Jocelyn and I started our first attempt at guidelines for VoiceOver and TalkBack. Before we got into annotations, we found it necessary to identify each type of component we had in our system.

A few examples of components and other elements we use.

Once that was done, we started annotating everything in Figma.

Refining our Documentation

After we met with our developers to go over the documentation we created, they started implementing the solutions we proposed. However, over time it became clear that some components required more documentation.

  • Components without a state to report had guidelines that were easy to follow.
  • Components with states or multiple items had guidelines that were interpreted as a requirement to dictate every item at once, which would have been too verbose.

With that issue identified, we adjusted our annotations to make the expectations clearer for more complex components such as the tab bar and form fields.
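As an illustration of where the earlier annotations fell short, here is a hedged Compose sketch of a single tab item (a hypothetical example, not our production tab bar). Annotating the role and state separately lets TalkBack announce something along the lines of "Selected, Schedule, tab" for the focused item, rather than dictating every tab in the bar at once.

```kotlin
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.semantics.Role
import androidx.compose.ui.semantics.role
import androidx.compose.ui.semantics.semantics
import androidx.compose.ui.semantics.stateDescription

// Hypothetical tab item; not the actual My PetSafe tab bar.
@Composable
fun TabItemLabel(title: String, isSelected: Boolean) {
    Text(
        text = title,
        modifier = Modifier.semantics {
            // The role tells TalkBack this is a tab; the state is reported
            // separately from the label, so only this item's status is read.
            role = Role.Tab
            stateDescription = if (isSelected) "Selected" else "Not selected"
        }
    )
}
```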

Global Accessibility Awareness Day (GAAD) 2022

To this day, Jocelyn, the rest of the UX team, and I continue to advocate for accessibility, and in May of 2022 we hosted RSC's second observance of Global Accessibility Awareness Day (GAAD). During this event I had the opportunity to present the progress we'd made so far, speak to how accessibility is part of my team's process, and encourage others at RSC to prioritize accessibility throughout every step of our product development process.

Where We Are Now

Many Jira tickets later, we were able to address the majority of the issues we observed, and in February of 2022 we released an updated version of our app on the Google Play Store.

ScoopFree now lets users know how many rakes have occurred over the last week.

While I'm proud of what we've achieved so far, the work is not done. Unfortunately, we were unable to test our improvements with users, and as development continued, other opportunities for improvement were identified, such as:

  • Color usage (some items failed WCAG contrast requirements; a quick sketch of that check follows this list)
  • Localization (UK vs. American English has its share of wrenches to throw)
  • Native components (using native components can make implementing/maintaining accessibility easier)
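For context on the contrast point above, this is a rough Kotlin sketch of the WCAG 2.1 contrast check. The colors are placeholders, not values from our actual palette; normal-size body text needs a ratio of at least 4.5:1 against its background to meet AA.

```kotlin
import kotlin.math.pow

// Relative luminance of an sRGB color per the WCAG 2.1 definition.
fun relativeLuminance(r: Int, g: Int, b: Int): Double {
    fun channel(c: Int): Double {
        val s = c / 255.0
        return if (s <= 0.03928) s / 12.92 else ((s + 0.055) / 1.055).pow(2.4)
    }
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)
}

// Contrast ratio between two luminances: (lighter + 0.05) / (darker + 0.05).
fun contrastRatio(l1: Double, l2: Double): Double {
    val lighter = maxOf(l1, l2)
    val darker = minOf(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)
}

fun main() {
    // Placeholder gray text on a white background.
    val text = relativeLuminance(0x9E, 0x9E, 0x9E)
    val background = relativeLuminance(0xFF, 0xFF, 0xFF)
    val ratio = contrastRatio(text, background)
    println("Contrast ratio: %.2f:1, passes AA for body text: %b".format(ratio, ratio >= 4.5))
}
```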

The issues we observed provided the opportunity for some design system enhancements and improvements to our process, both of which are challenges I'm very eager to tackle.

Takeaways

Before I started working on this, my knowledge of screen reader accessibility was non-existent, so my first experience using one certainly presented a bit of a learning curve, but it really helped me empathize with people who primarily use one to navigate their devices. On top of that, given the size of our teams and varying levels of bandwidth, it has at times been challenging to maintain momentum, but this work is extremely valuable and has to be treated as such. If anything, I've found a new passion that I'm grateful is part of my work, and I will continue to champion the needs of the many versus the few.