Why Is an Agency the Perfect Place to Observe and Improve Digital Accessibility?

I work for Code and Theory (C&T), a digital-first creative agency headquartered in New York. C&T is an exceptionally iterative company—creating products and experiences at a rapid pace for clients. In the last week alone, I have researched and implemented test solutions for a hospitality company, a financial tech corporation, a hospital system, a startup, an internal project, and a fashion entity. 

C&T has a diverse client pool, but we tend to observe similar challenges across all industries. A great example: many clients are trying to handle accessibility but are siloed within their respective verticals. Regardless of type or size, most businesses share the same concerns about implementing accessibility. As a result, we’re tasked with problem-solving whatever comes to us in a platform-agnostic way.

When I started at C&T five years ago, clients were questioning the need for accessibility altogether. Occasionally, a client-side compliance department would give a cursory review of the major website landing pages we designed. If the client didn’t know about an internal accessibility compliance requirement, production releases would be held and remediation costs would increase. Over time, the Section 508 Refresh became a reality—requiring Federal agencies to make their online presence accessible to people with disabilities. Finally, the private sector became more intertwined with accessibility compliance validation.

At C&T, we integrated a baseline WCAG Level A minimum into our web development Scopes of Work (SOWs) by default. This went into effect after we evaluated the baseline standards of the development team. The new contract language produced a lot of conversations with our clients. Many were already being evaluated against WCAG Level AA, based on their public visibility and/or their liability needs. Clients were preparing to include WCAG AA as the expected business requirement—and they expected us to implement it capably.

Today, the majority of our clients have a mature outlook on accessibility. Many have their own capable accessibility validation teams. Others are working with us to build that specialization within their organization. Our clients are becoming more educated about the cost implications of leaving accessibility as an afterthought—not only the obvious lawsuit implications but also the effort involved in revising a project that is 95% complete. They come to us with the knowledge that other companies are facing digital accessibility-related litigation—and its fallout.

Cost Increase Over the Development Timeline

Earlier in 2019, I studied the cost implications of correcting a color palette that doesn’t pass accessibility standards. As expected, the financial impact of making the change in the early stages of a project turned out to be far less severe than, say, rushing the change through a client’s legal compliance department two weeks before launch.

Strategy/Product Definition Period: 6 Hours

It takes approximately 6 hours to determine and validate whether a color palette is accessible before a design system is executed. In a recent project, we discovered that a brand color palette from a particular client partner was problematic on a white background. The strategy team brought the concern to the client’s marketing team, spurring conversation prior to starting on the design system.
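
To give a sense of what that validation involves, here is a minimal sketch of a WCAG 2.x contrast check in TypeScript. The relative luminance and contrast ratio formulas come straight from the WCAG definitions; the hex values and the `meetsAA` helper are illustrative, not from an actual client palette.

```typescript
// Minimal WCAG 2.x contrast check (illustrative sketch, not a C&T tool).

// Relative luminance of a "#rrggbb" color, per the WCAG definition.
function relativeLuminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio between two colors: (L_lighter + 0.05) / (L_darker + 0.05).
function contrastRatio(fg: string, bg: string): number {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// WCAG 2.x AA requires 4.5:1 for normal text and 3:1 for large text.
function meetsAA(fg: string, bg: string, largeText = false): boolean {
  return contrastRatio(fg, bg) >= (largeText ? 3 : 4.5);
}

// A light brand gray on a white background fails AA for body text (~2.85:1).
console.log(meetsAA("#999999", "#ffffff")); // false
```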

Design Period: 27 Hours

If the need for a color palette change is discovered when a project is already in its design phase, the financial burden increases. It averages 27 hours to adjust the design system and update the documentation. In this case, the effort is generally focused on applying the color palette consistently across all project deliverables. However, there’s also the expense of bringing in a Quality Assurance (QA) Engineer with an accessibility specialization to review the changes. Design revisions may also require additional discussions with a client who has already fallen in love with the look and feel of the work.

Development Period: 36 Hours

If a developer determines prior to coding that a color palette combination doesn’t pass accessibility standards, they must collaborate with the designers and the client to confirm the change. This averages about 36 hours of revisions. The number increases as the team gets closer to launch.

Quality Assurance: 75 Hours

When a QA Engineer uncovers a color palette issue, the costs of updating tests and cross-validating other features for consistency increase further. The key here is validating consistency across the entire ecosystem. Once you factor in developer remediation, documentation, and design/client collaboration, this averages 75 hours of effort during the latter parts of product development.

Legal Compliance: 157+ Hours

Should an issue be uncovered during legal remediation of any type, the costs skyrocket—averaging about 157 hours of remediation effort across all counterparts. (That said, it’s still a lot less expensive than the cost of halting development of new features.) This number greatly increases if the team is under the duress of a lawsuit.

To determine the average hours per task, I researched our existing tickets with remediation paths from the last two years. I also directly observed multiple projects as they progressed through the early definition and design periods. Each represented hour is billable, and the cost varies with the seniority of the contributor. I also factored in the amount of client time spent as both project stakeholders and validators. My findings are an average of that research. Additional factors such as client needs, complexity of systems, complexity of the design, and documentation fidelity can swing these numbers higher or lower.

Being in quality assurance gives me a “train caboose” perspective on projects. I’m able to diagnose problems at the meta-level and provide feedback and an understanding of knowledge gaps. One of my personal favorites is translating dry WCAG standards into forward momentum for the team. To a beginner, the WCAG standards are hard to consume and comprehend, as they’re written in such a dry manner. For a long time, the standards felt disconnected from any particular person’s job duties. But I feel that accountability and understanding from all project disciplines are changing that.

Prioritizing Equal WCAG Standards 

WCAG standards are all equal in weight. However, when prioritizing for an Agile release structure, there may be a severity consideration based on the visibility or functionality of the product. If an end user cannot execute a high-priority feature—regardless of input method or accessibility need—it is considered a blocker.

For example, if a keyboard user cannot use the navigation, that is a highest-priority defect. If a hover state misses the required color contrast ratio by a slight margin, that is a lower-priority defect.

This approach is aligned with a pre-determined “Definition of Defects,” which cites accessibility as both a business requirement and a user interaction mode. Between the Definition of Defects and the business requirement that a site be accessible to WCAG 2.1 AA standards, there is a set mechanism for prioritizing defects based on their severity. We include this Definition of Defects in our SOWs. I recommend all teams create a Definition of Defects document, as it gives them a metric for determining next steps per ticket and per Sprint.
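
To illustrate what such a mechanism might look like, here is a hypothetical severity rubric sketched in TypeScript. The levels and examples are my own illustration of the idea, not C&T’s actual Definition of Defects language.

```typescript
// Hypothetical Definition of Defects severity rubric (illustrative only).

type Severity = "blocker" | "high" | "medium" | "low";

interface DefectRule {
  severity: Severity;
  criterion: string; // when this severity applies
  example: string;
}

const accessibilityDefectRules: DefectRule[] = [
  {
    severity: "blocker",
    criterion:
      "A high-priority feature cannot be executed, regardless of input method or accessibility need.",
    example: "A keyboard-only user cannot operate the site navigation.",
  },
  {
    severity: "low",
    criterion: "A WCAG 2.1 AA miss with minimal functional impact on any user.",
    example: "A hover state misses the required contrast ratio by a slight margin.",
  },
];
```

Whatever form the document takes, the point is that severity is decided by the rubric up front, not renegotiated ticket by ticket.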

QA Engineering: Driving the Change

Our QA team at C&T is hired “to think of thinky things.” Each member is trained to give useful feedback in a neutral technical writing format. We are all capable of stepping into multiple user scenarios at once. The QA team surfaces meta-level trends while detailing granular, actionable feedback in line with the overall health and other needs of the project. We consume the project, and we have to detangle all the non-functional requirements and determinations made prior to the start of testing. We’ve seen the result of everyone’s input to a project, and how it has all come together.

From an accessibility POV, the expectations I have for my team are to:

  • Expect a large amount of testing to be manual in nature. Automation toolsets have coverage holes and need to be complemented by a skilled, trained manual QA Engineer (see the sketch after this list).
  • Get into the trenches with development teams—particularly front-end development. Work alongside the team in a supportive and humble manner.
  • Work with the legal team to determine contractual defaults and to identify yourself as a “helpful stakeholder.” If you’re working on an internal/non-contractual project, consider researching potential liability issues and help draft language or business requirements to provide the compass.
  • Pair with team members outside of QA (interaction designers, strategists, project management) to first detangle them from the mystique of accessibility, and then turn them into our largest allies and collaborators.
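
Automated scans catch only a machine-checkable subset of WCAG, which is why the first bullet above insists on manual testing as a complement. As a minimal sketch of that automated slice, here is an axe-core scan driven through the @axe-core/puppeteer package; the URL is a placeholder, and the result handling is illustrative rather than our actual harness.

```typescript
import puppeteer from "puppeteer";
import { AxePuppeteer } from "@axe-core/puppeteer";

// Run an automated axe-core scan against a page (URL is a placeholder).
async function scan(url: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url);

  // Limit the scan to WCAG 2.x A/AA rules; axe automates only a subset
  // of the success criteria, so a manual pass is still required.
  const results = await new AxePuppeteer(page)
    .withTags(["wcag2a", "wcag2aa"])
    .analyze();

  for (const violation of results.violations) {
    console.log(`${violation.id}: ${violation.nodes.length} node(s) affected`);
  }

  await browser.close();
}

scan("https://example.com");
```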

Additionally, I always recommend adding internal accessibility training opportunities—both synchronous and asynchronous—to my team’s work. We produce training for departmental meetings, we reach out to other departments through Lunch and Learn workshops, and we maintain an Accessibility Slack channel open to all C&T employees. Advocating for per-discipline understanding allows teams to translate accessibility business requirements into their own methodologies and deliverables. We plan for the long game; we are continuously working on accessibility initiatives.

What C&T Is Doing to Evolve Ourselves

Working on initiatives like accessibility requires an adaptable roadmap and a communicative, feedback-oriented nature. These high-level initiatives are what drive change forward for any deliverable specialization. However, shifting accessibility left has special considerations for its adoption. Here are my primary high-level pushes, which are then tasked out at the daily level:

  • Pair with designers and strategists early to discuss an accessibility business requirement while keeping their design vision creative and intact.
  • Determine team-centric toolsets that are scalable and can be easily incorporated for maximum efficacy and adoption. Researching and employing a “Petri dish” trial provides gradual improvement of either toolsets or process.
  • Validate and remediate accessibility defects in line with sprints. Avoid saving “all the accessibility tickets for the end.” Prioritize accessibility-related defects in the same manner as other functionality/usability defects.
  • Evaluate estimates against time burned as best as possible over the evolution of a project. This provides project management teams with valuable, nuanced estimates and allows you to understand accessibility velocity within the development teams.
  • Invest in training, mentoring, and scaling initiatives within the team. An internal training and mentorship program allows a specialist QA Engineer to develop a team member’s capabilities through practical application on projects.
  • Avoid being the single point of contact for accessibility throughout the organization (unless it is your job). This is not sustainable and hinders individual team member growth. Build up your team members to scale this important aspect.

You can move teams forward at a grassroots level, whether or not you’re in a test role. Here are my recommendations to help build awareness on your team:

  • When discussing accessibility adoption on your own teams, devise how this can be operationalized and scaled from the start. Standardized team toolsets, as well as a default ticket template, help our team enormously. 
  • Template-ify everything you possibly can! I have a sample set of RFP responses that evolves over time. It’s easier to riff from a default baseline.
  • When pushing for accessibility through other teams, determine primary motivators for each individual and how your skillset can become indispensable to their own deliverable needs.
  • Determine how your organization will “sell” this to clients. How is it useful to their specific bottom lines? Is this a value-add that makes your organization special? 
  • Push for scientific method principles in an iterative environment. Always look for incremental improvement project-over-project or deliverable-over-deliverable. Ensure that it is measurable. 
  • Consider hiring people with disabilities to produce first-person context and narrative.

Future-Proofing and What’s on the Radar

As we look to the horizon for accessibility requirements, we see a need to constantly adjust our tactics and deliverables. Right now, we are simplifying internal test toolsets and continuing toolset evaluations. We are currently working with JAWS Inspect and axe Pro on long-term evaluation and potential adoption. We are also researching intersectional test specializations: security x accessibility, data analytics x accessibility, and automation testing x accessibility.

Development teams are continuing to evaluate and incorporate Pa11y into our continuous integration framework, and are adding an accessibility checker into Storybook. I’ve also researched Integrated Development Environment (IDE) integrations for accessibility that provide real-time recommendations on first-party code.
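
As a sketch of what those integrations can look like, the following shows Pa11y’s Node API running a WCAG 2 AA scan, followed by a minimal Storybook configuration registering the official a11y addon (which runs axe-core against each story). The URL and story paths are placeholders; a real CI setup (e.g., pa11y-ci) and a full Storybook config carry more options.

```typescript
// Pa11y's Node API in a CI step (URL is a placeholder).
const pa11y = require("pa11y");

async function audit(): Promise<void> {
  const results = await pa11y("https://example.com", {
    standard: "WCAG2AA", // accessibility standard to test against
  });
  for (const issue of results.issues) {
    console.log(`${issue.code}: ${issue.message} (${issue.selector})`);
  }
  // Fail the CI step if any issues were found.
  process.exitCode = results.issues.length > 0 ? 1 : 0;
}

audit();
```

```typescript
// .storybook/main.ts (minimal): register the a11y addon; real configs
// also declare a framework and other project-specific settings.
export default {
  stories: ["../src/**/*.stories.@(ts|tsx)"],
  addons: ["@storybook/addon-a11y"],
};
```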

Outside of development, we are opening dialogues with existing client accessibility teams during very early stages of scoping and strategic definition. We are devising cooperative plans with other client vendors to produce end-to-end accessible integrated deliverables. Finally, we are consulting with clients to detangle and operationalize accessibility in order to sustain an internal practice of their own.

I challenge folks reading this article to determine their own organization’s next steps on accessibility. It’s important to measure, and to provide context to stakeholders in terms that make sense for their priorities and job duties. In the future, we expect to see accessibility considered from the earliest moments of a project. We’re researching how to create accessible creative technology experiences, incorporating inclusivity into augmented reality and hardware/software builds. We’re also continuing our internal education and scaling of accessibility initiatives, an important constant. Code and Theory is pushing forward with cutting-edge design and development concepts that are inclusive and functional for our clients as the new decade commences.

About Sara Tabor

Sara Tabor is a Director of Quality Assurance for Code and Theory. Sara has worked in the technology field for the last twenty years, concentrating on website and application development quality assurance for the last twelve years. Sara is skilled in process management, improvement strategy, and communicating detailed feedback to help developers become more agile in their craft. She is passionate about working with quality assurance engineers to foster valuable career paths, mentoring each to become creative problem-finders and specialists within the field. Sara also builds end-to-end awareness and iterative feedback for accessibility needs within Code and Theory. She is well-versed in managing multi-timezone test teams in an Agile Follow the Sun method.