Accessible Products: Beyond Code

In the field of creating accessible products and experiences, I have found that we are behind when it comes to one group in particular: individuals with Cognitive Disabilities. As a Digital Accessibility Specialist with Cognitive Disabilities myself, I feel confident addressing this gap. I’ve been doing a lot of research and speaking about this subject, and along the way I’ve come across misconceptions about what it takes to create and maintain accessible products. Among other things, these misconceptions are perpetuated by a lack of disabled voices in the field, and I aim to clear them up.

Compliant Code Does Not Mean Accessible Products

First I’d like to dispel this common belief: if your website’s code is compliant, you have an accessible site. I’m here to say that, for someone whose brain works like mine, that couldn’t be further from the truth. Let’s take a moment to discuss the impact of code vs. design on Cognitive Accessibility, starting at the structural level.

The four main roles involved in an application lifecycle are:

  1. User Experience Research (UXR)
  2. Design
  3. Development (Dev)
  4. Quality Assurance (QA)

These four different groups of people ask themselves two very different questions. The first question, asked by UXR and Designers, is “was it designed accessibly?”. The second, from Development and QA, is “was it coded accessibly?”. As an Inclusive Designer, I have no control over the answer to the second question. I have to trust that the other members of the development lifecycle are doing their part. With all of these roles handling important parts of the process, why is it that we spend so much time focused on Development? There are countless websites, endless blog posts, and full conferences based around accessible code. Now don’t get me wrong, this is a critical piece of the puzzle, but it’s not the only one. Even if each piece of the site is coded accessibly, there’s no guarantee that putting those individual accessible puzzle pieces together is going to make a picture. That’s where a designer comes in.

Code that is based on inaccessible wireframes and experiences will not result in an accessible product. You can never out-code bad design. Even the most accessible code possible oftentimes means very little for the experience of a user with Cognitive Disabilities. There are a few caveats that we’ll return to later, but in general, Inclusive Design is where Cognitive Accessibility comes into play.

Inclusive Design

Let’s define that term: Inclusive Design, per Microsoft, is “a methodology, born out of digital environments, that enables and draws on the full range of human diversity. Most importantly, this means including and learning from people with a range of perspectives”. The importance of this methodology is that it is about designing for all people, drawing on that full range of human diversity. Rather than thinking only of usability, we need to think about inclusion and how to design with all people in mind. This point is critical for me because the barriers I experience with my ADHD are the result of purposeful design decisions. The barriers that block me from doing my job, from getting my bills paid, or from succeeding are ones that were decided, picked, and put in place by individuals who did not understand or think about Cognitive barriers.

As someone with ADHD, I like to explain that it’s not the inability to focus, it’s the inability to not focus. The way my brain works means that spelling is difficult, focus is a challenge, numbers are nonsense, I experience time dilation, I’m sensitive to light and sound, and I deal with impulsivity. None of this means that I, or any other Disabled individual, can’t succeed when products are designed with us included. But it does mean I am susceptible to barriers created by the mentality that “Minimum Viable Products”, or MVPs, don’t have to be as ‘usable’. For me, the things that make a product ‘usable’ are actually make or break, especially when ability is always changing.

Spoon Theory

When we’re thinking about creating experiences for someone with Cognitive Disabilities, we need to keep in mind that ability is a sliding scale. With that in mind, something we don’t think about enough is how disabilities are not binary, cut and dry, or the same hour by hour. This idea is ingeniously captured by Spoon Theory, coined by Christine Miserandino. Spoon Theory is a concept used by the Chronic Illness community to explain energy usage, with spoons as the unit of measurement. The theory states that individuals are given 10 spoons to use throughout the day, regaining them by sleeping. It costs Disabled individuals more spoons to exist in an inaccessible and fundamentally ableist society, and they get fewer spoons back from rest. For example, for someone with Chronic Pain, every movement causes pain and it takes more emotional and physical energy to be a human. For them, waking up, getting breakfast, and leaving the house costs three spoons. The rest of their workday costs four spoons, and transit one more. When they arrive home, they only have two spoons left to do laundry, pay bills, cook dinner, and prepare for the next day. Someone without Chronic Pain, on the other hand, could still have five spoons when they return home to use on their remaining daily tasks. On top of having fewer spoons left, someone with Chronic Pain also gains fewer spoons from resting, so they quickly find themselves in a spoon deficit. This system applies to Cognitive Disabilities in a similar fashion.

This theory describes the amount of energy it takes to both physically and emotionally exist as a disabled person in a society that is not built for us. When we’re talking about a sliding scale of abilities, it’s important to understand that what somebody can do in the morning, when they have all of their spoons, versus at the end of the day, when they have none, are vastly different things. The barriers I find difficult at 11 am are oftentimes the ones that leave me in tears at 10 pm. We need to embrace the idea that when it comes to disabilities, it isn’t always a matter of “can” and “can’t”; it’s an ever-shifting, evolving, complicated scale affected by an infinite number of factors. To create truly accessible experiences with Cognitive Accessibility in mind, we have to think about users with no spoons. Hell, even users with a spoon deficit, because inaccessible design failures are just as harmful as those found in code.

WCAG and Cognitive

As someone who works to make sure products are compliant, as well as someone who gets left out of the standards, I have a love-hate relationship with the Web Content Accessibility Guidelines (WCAG). Accessibility in design is so much more than color contrast, but that’s sadly where some conversations about design-related accessibility end. When looking for WCAG criteria that are specifically written for people with Cognitive Disabilities, the list is pretty short; by my count, there are six that were written Cognitive-first. WCAG has a lot of gaps when it comes to Cognitive Accessibility because Cognitive Accessibility is incredibly complex. We oftentimes see fewer Cognitive-related criteria because creating a testable solution for one group of users can create a barrier for the next. In the end, we are left with gaps for Cognitive Accessibility, leading to sites that are technically compliant but still inaccessible to many users. Conformance to the standards does not ensure an accessible site. With all of these gaps for Cognitive, and knowing that meeting standards is a baseline, we need to think more about our users and create products and experiences that include everyone. Thankfully, that’s where the Conformance Requirement of Non-Interference steps in.

Non-Interference

Non-Interference more or less states that technologies that are not accessibility supported can be used as long as:

  1.  All the information is also available using technologies that are accessibility supported; and
  2.  The non-accessibility-supported material does not interfere.

Boiled down, what this means is to make sure that, one way or another, things are able to be accessed and that they don’t mess with assistive technology.

Now, when people read “assistive technology” (AT), we normally think about Screen Readers, Screen Magnifiers, and potentially alternative mice and keyboards. I would like to challenge what we think of as AT, as well as the way we conceptualize AT users.

If the list above is all we think of as Assistive Technology, then we are missing out on huge use cases. For starters, people with Cognitive Disabilities use assistive technology outside of what would be expected of them, such as Screen Readers, Captions, and Speech Recognition Software. Beyond that, the types of technologies leveraged by someone with a language-processing-related disability to help them communicate are also Assistive Technologies. For example, someone with dyscalculia and dyslexia uses a spell checker, Speech-to-Text, copy and paste, and Screen Readers. When thinking about ADHD, read-only mode, ad blockers, autocomplete, and captions are all critical. If our definition of assistive technology is too narrow when thinking about Cognitive Disabilities, then these users aren’t being understood or appropriately represented in the decisions being made in design. We simply need to realize that usability concerns are actually Cognitive Accessibility concerns.

Examples

Sensory and Distraction

Assistive Technologies:

  • Read-Only Mode
  • Ad Blocker
  • Auto-complete
  • Captions

There are many Cognitive Disabilities that are adversely affected by moving, blinking, or distracting design patterns. Such disabilities include, but are not limited to, ADHD and Autism. Patterns that move, such as carousels, micro-interactions, autoplay, advertisements, and others, can be anywhere from bothersome to a barrier depending on the spoon level.

Things like Autoplay, a pattern that grew in popularity over the past few years thanks to content-sharing sites such as Facebook and Twitter, can be a full blocker for some individuals with ADHD. The issue occurs when moving content is played in parallel with other content without a way to pause, stop, or hide it. It can be found in a myriad of apps in the form of gifs, advertisements, or video content. This is true of interactions such as Netflix’s home page. When the home page is navigated to, a full-screen trailer for a new, upcoming, or popular show begins to play automatically with sound enabled. Even with the inclusion of a mute button, there is no way to stop the movement, derailing users with ADHD and oftentimes making the site unusable. Even autoplay content that does have a way to pause it, such as the videos that play at the top of news articles or recipe sites, is still a problem: breaking the train of thought of an individual with an attention-related disability takes only a second. So while these examples do technically pass standards, they make for bad experiences full of barriers. Thankfully, some applications such as LinkedIn, Twitter, and recently Pinterest have included a setting to turn off auto-playing content. The hope is that other products will soon follow suit.
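
To make the pattern concrete, here is a minimal sketch of my own (not taken from any of the products named above) of how a hero video could respect a reduced-motion preference and offer a real pause control; the element IDs are assumptions for illustration only.

```typescript
// Minimal sketch: only autoplay a muted trailer when the user has not asked
// for reduced motion, and always provide a working pause/play control.
// '#hero-trailer' and '#hero-pause' are illustrative IDs, not a real product's.
const heroVideo = document.querySelector<HTMLVideoElement>('#hero-trailer');
const pauseButton = document.querySelector<HTMLButtonElement>('#hero-pause');

const prefersReducedMotion = window.matchMedia(
  '(prefers-reduced-motion: reduce)'
).matches;

if (heroVideo && pauseButton) {
  heroVideo.muted = true;        // never autoplay with sound
  if (!prefersReducedMotion) {
    void heroVideo.play();       // autoplay only when motion is acceptable
  }

  pauseButton.addEventListener('click', () => {
    if (heroVideo.paused) {
      void heroVideo.play();
      pauseButton.textContent = 'Pause trailer';
    } else {
      heroVideo.pause();         // a real stop for the movement, not just a mute
      pauseButton.textContent = 'Play trailer';
    }
  });
}
```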

Language Processing

Assistive Technologies:

  • Captions
  • Transcripts
  • Pause and Rewind function

Difficulty processing and deciphering language isn’t often talked about as a disability, but according to the National Institute on Deafness and Other Communication Disorders (NIDCD), anywhere between 6 and 8 million people in the US have a language-related disability. It affects people with Autism, ADHD, and Dyslexia, as well as individuals with Epilepsy. Millions of people struggle both with quickly taking in written language and turning it into information and with hearing spoken language and processing it. This is where captions, using text and image in parallel, and proper content structuring come into play.

It’s well known that Captions are used by people of all abilities, but there are millions of individuals who are hearing and rely heavily on written Captions in parallel with the spoken word to enjoy video content. The same can be true for transcripts of Podcasts. These individuals, myself included, who rely heavily on captions to help process spoken words are especially affected by bad caption work, also known as Craptions. When captions are paraphrased, reworded, censored, or poorly timed, individuals who can hear the spoken word are forced to parse both the spoken and the written language to determine which information is correct. It doubles the cognitive load and can create a real barrier between them and their understanding of the material. The rise in popularity of auto-captions has helped bring access to videos that have no captions at all, but it’s not enough for either hearing or Deaf individuals.
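
As a small illustration of the mechanics (a sketch of my own, with made-up file paths and container ID), a video can ship with an authored caption track and a linked transcript rather than relying on auto-captions:

```typescript
// Minimal sketch: attach an authored (human-reviewed) caption track and link a
// transcript. The file paths and '#talk' container are illustrative assumptions.
const video = document.createElement('video');
video.controls = true;
video.src = '/media/talk.mp4';

const captions = document.createElement('track');
captions.kind = 'captions';
captions.srclang = 'en';
captions.label = 'English captions';
captions.src = '/media/talk.en.vtt';   // authored WebVTT, timed to the audio
captions.default = true;
video.append(captions);

const transcriptLink = document.createElement('a');
transcriptLink.href = '/media/talk-transcript.html';
transcriptLink.textContent = 'Read the full transcript';

document.querySelector('#talk')?.append(video, transcriptLink);
```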

Along with captions, keeping text labels alongside icons is a pattern that can be the safety net keeping Low Spoon individuals from making risky errors. The common pattern of removing labels from icons to save real estate at smaller breakpoints transfers the responsibility onto the user to learn and memorize what very small icons mean. On the other hand, icons are great visual representations that can help bring meaning to written labels. Having something like an icon of a floppy disk next to the word ‘Save’ can be incredibly meaningful to someone who isn’t sure they read the word correctly. If icons have labels, and vice versa, they should keep them regardless of window size. Breakpoint shouldn’t determine access to information.
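
For the sake of illustration, here is a rough sketch of my own (the class name, icon path, and container ID are assumptions) of a control that keeps its icon and its visible text label paired at every window size:

```typescript
// Minimal sketch: a Save control whose icon and visible text label stay together
// at every breakpoint, rather than hiding the label on small screens.
// The class name, icon URL, and '#toolbar' container are illustrative assumptions.
function createLabeledIconButton(label: string, iconUrl: string): HTMLButtonElement {
  const button = document.createElement('button');
  button.type = 'button';
  button.className = 'labeled-icon-button';

  const icon = document.createElement('img');
  icon.src = iconUrl;
  icon.alt = '';                 // decorative; the visible text carries the meaning
  icon.setAttribute('aria-hidden', 'true');

  const text = document.createElement('span');
  text.textContent = label;      // visible at all window sizes, never display: none

  button.append(icon, text);
  return button;
}

document.querySelector('#toolbar')?.append(
  createLabeledIconButton('Save', '/icons/floppy-disk.svg')
);
```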

Working Memory

Assistive Technologies:

  • Password Management Tool
  • Autocomplete
  • Copy and Paste

Cognitive Load and working memory go hand in hand when we’re talking about creating usable design patterns. When users are asked to remember complicated workflows, ambiguous iconography, and especially form-related content, individuals with or currently experiencing limited working memory are at a higher risk of errors. In order to create experiences with these users in mind, designs should do all they can to take cognitive load off of the user, for example by using patterns such as reveal-password toggles, autofilled form fields, ‘remember me’ options, and proper inline error handling. Oftentimes viewed as ‘nice to haves’, these types of features can create or remove barriers for individuals with limited working memory.

The rise in popularity of the option to unmask passwords is an exciting one for multiple reasons. It gives users the chance to catch mistakes before triggering an error, the ability to pick up where they left off, and a better shot at getting the password correct if, in the end, it does trigger an error. Even better than having a password reveal itself is allowing users not to have to enter their password in the first place. ‘Remember Me’ patterns aim to make login easier for users returning from familiar devices by allowing them to opt to remain logged in. Third-party services such as Google and LastPass are also leveraged to store passwords so that people with limited memory don’t have to remember them to begin with.
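
As a rough sketch of the unmasking pattern (my own illustration; the element IDs are assumptions), a password field can advertise itself to password managers and offer a show/hide toggle:

```typescript
// Minimal sketch: a password field that supports autofill/password managers and
// a show/hide toggle so typos can be caught before an error ever fires.
// '#password' and '#toggle-password' are illustrative IDs.
const passwordInput = document.querySelector<HTMLInputElement>('#password');
const toggleButton = document.querySelector<HTMLButtonElement>('#toggle-password');

if (passwordInput && toggleButton) {
  passwordInput.autocomplete = 'current-password'; // let the browser or a manager carry the memory load

  toggleButton.setAttribute('aria-pressed', 'false');
  toggleButton.addEventListener('click', () => {
    const reveal = passwordInput.type === 'password';
    passwordInput.type = reveal ? 'text' : 'password'; // unmask or re-mask the value
    toggleButton.textContent = reveal ? 'Hide password' : 'Show password';
    toggleButton.setAttribute('aria-pressed', String(reveal));
  });
}
```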

The same can be said for one-time codes sent via text that auto-populate in apps such as Uber. Removing the step of entering a code, or at a minimum allowing users to paste in the numbers, saves spoons and protects users from harmful lockout situations. Patterns like this and inline error validation are critical, especially when we’re talking about inputting data such as dates or phone numbers that can be given in seemingly endless formats. Having proper instructions in tandem with robust error suggestions is the way to go in order to take the Cognitive Load off of users.
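
A minimal sketch of that idea (my own, with assumed element IDs and an assumed 6-digit code) might look like this: let the platform offer the texted code, never block paste, and forgive formatting before flagging an error.

```typescript
// Minimal sketch: a one-time-code field that lets the device suggest the texted
// code, keeps paste available, and validates inline with a forgiving format.
// '#one-time-code', '#one-time-code-error', and the 6-digit rule are assumptions.
const codeInput = document.querySelector<HTMLInputElement>('#one-time-code');
const codeError = document.querySelector<HTMLElement>('#one-time-code-error');

if (codeInput && codeError) {
  codeInput.setAttribute('autocomplete', 'one-time-code'); // platforms can offer the SMS code directly
  codeInput.inputMode = 'numeric';                         // numeric keypad on touch devices
  // Note: no paste blocking here; pasting the code is the whole point.

  codeInput.addEventListener('blur', () => {
    const digitsOnly = codeInput.value.replace(/\D/g, ''); // forgive spaces and dashes
    codeInput.value = digitsOnly;
    const valid = digitsOnly.length === 6;
    codeInput.setAttribute('aria-invalid', String(!valid));
    codeError.textContent = valid
      ? ''
      : 'Please enter the 6-digit code we sent by text message.';
  });
}
```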

Conclusion

When it comes to creating accessible products and experiences for users with Cognitive Disabilities it’s important to remember:

  • All roles in an application life cycle are important when making accessible products. No designer can out-design bad code and no developer can out-code bad design.
  • Ability is a sliding, non-binary scale and needs change. If we design for users with lower spoon levels, we will be creating products that are more usable and more accessible to people with a wide variety of Cognitive Disabilities.
  • Design has the biggest impact on users with Cognitive Disabilities. We have to think about users with disabilities that limit attention, working memory, and language processing when thinking about how we build a product.
  • Usability is often just Accessibility for Cognitive. Things that are often thought of as “nice to haves” can be Assistive Technologies for People with Cognitive Disabilities.
  • The best way to break down barriers created in products is to listen to Disabled users, hire Disabled people, and do user research with Disabled people.

About Shell Little

Shell Little is an Accessibility Specialist at Wells Fargo DS4B, on their Accessible User Experience (AUx) team, where she is the Mobile and Inclusive Design Lead. She has a BA in UX, with a focus in Accessibility, and is passionate about disability rights and game accessibility.

Outside the office, Shell is a proud animal parent to a large menagerie, hobbyist streamer, and overall Neurodivergent dork.

2 comments on “Accessible Products: Beyond Code”

  • Thank you so much for this article. As a blind developer and accessibility specialist, this stuff is incredibly valuable to me.
    You are absolutely correct, far too little attention is given to cognitive disabilities and it is something I know relatively little about, something I really want to improve on in the coming time. I, for example, thought that an autocomplete widget might actually be more distracting than it is helpful, but I can also see that it might help you complete a thought or process before you lose track of it due to some distraction that might even be on the same page …is that at all accurate?
    In any case thanks again, and ironic as it may be you did open my eyes to just how big of an issue this is. Happy holidays 🙂

  • Shell, thanks so much for this piece! As a chronically ill writer, it’s so refreshing to hear someone bring the Spoon theory into design education. I appreciate the specificity of your usability recommendations according to different versions of cognitive disability.
