Awesome Automation Strategies for Accessibility Testing with Crystal Preston-Watson

By Test Guild

About This Episode:

Accessibility is more than just a checkbox; it's become a civil right and a human need. Today, we'll explore how proper automation can elevate your accessibility testing to the next level.

Check out our sponsor's accessibility automation tool now: https://testguild.me/astack

In today's episode, “Automation and Accessibility: Mutual, Not Exclusive,” Crystal Preston-Watson, a senior digital accessibility analyst at Salesforce, joins us.

This session was taken from our Automation Guild 2022 online event, where Crystal took us on an insightful journey, addressing higher-level questions and critical issues related to accessibility testing, specifically in automation.

She also covers the essential considerations teams must address before diving into accessibility automation.

Learn about the different types of accessibility testing—manual, automation, and user acceptance—and discover why automated tools alone aren't enough to ensure proper accessibility.

Listen in to discover the importance of experience with accessibility, the need for company-wide support and capacity, and best practices for combining manual and automated testing.
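To give a concrete feel for the kind of automated checks discussed in the episode, here is a minimal sketch of a page-level scan using the open-source axe-core engine through its Playwright integration. This is not a tool from the episode or its sponsor; the test name and URL are placeholders.

```typescript
// Minimal sketch: a page-level accessibility scan with axe-core via Playwright.
// Assumes @playwright/test and @axe-core/playwright are installed; the URL is a placeholder.
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('home page has no detectable WCAG A/AA violations', async ({ page }) => {
  await page.goto('https://example.com'); // placeholder URL

  // Run the axe-core rules engine against the rendered page,
  // limited here to WCAG 2.0/2.1 A and AA rule tags.
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'])
    .analyze();

  // Fail the test if any violations (missing alt text, poor contrast, etc.) are reported.
  expect(results.violations).toEqual([]);
});
```

As the episode stresses, a scan like this catches only a fraction of real-world accessibility issues; it complements, rather than replaces, manual and user testing.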

By the way, to get on the waitlist for the lowest prices on tickets for the next Automation Guild event in February, go to automationguild.com now.

And speaking of accessibility testing, I wanted to share an automated accessibility tool from this episode's sponsor, BrowserStack.

Exclusive Sponsor

BrowserStack Accessibility Testing

Are you a developer or QA professional struggling with web accessibility testing? Our sponsor, BrowserStack, has the solution you've been waiting for!

BrowserStack Accessibility Testing stands out as an all-in-one platform that simplifies the process of making your websites inclusive and compliant. Its unique features make it a game-changer in the field of web accessibility testing.

Workflow Scanner: Identify basic and complex accessibility issues at lightning speed. Scan entire user workflows in one go, catching problems like missing alt text or poor color contrast without multiple scans.
Assisted Tests: Tackle advanced accessibility concerns with auto-generated questions that guide you through the process.
Real Device Testing: Get instant access to real screen readers on actual devices, including VoiceOver, NVDA, and TalkBack. Experience your site as users with disabilities do.
Compliance Monitoring: Schedule regular scans and access comprehensive reports in a central dashboard to stay ahead of ADA and WCAG requirements.

By choosing BrowserStack Accessibility Testing, you're not just fulfilling requirements. You're creating truly inclusive websites, enhancing SEO, improving usability, and broadening your audience reach.

Ready to make your web applications more accessible? Visit https://testguild.me/astack to learn more and start your journey toward digital inclusivity today!

About Crystal Preston-Watson


Crystal Preston-Watson is dedicated to creating innovative, dynamic, and accessible applications for everyone. She brings a passion for quality, creativity, and the openness of technology to every opportunity she undertakes.

Connect with Crystal Preston-Watson

Rate and Review TestGuild

Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.

[00:00:00] In a land of testers, far and wide they journeyed. Seeking answers, seeking skills, seeking a better way. Through the hills they wandered, through treacherous terrain. But then they heard a tale, a podcast they had to obey. Oh, the Test Guild Automation Testing podcast. Guiding testers with automation awesomeness. From ancient realms to modern days, they lead the way. Oh, the Test Guild Automation Testing podcast. With lutes and lyres, the bards began their song. A tune of knowledge, a melody of code. Through the air it spread, like wildfire through the land. Guiding testers, showing them the secrets to behold. Oh, the Test Guild Automation Testing podcast. Guiding testers with automation awesomeness. From ancient realms to modern days, they lead the way. Oh, the Test Guild Automation Testing podcast. Oh, the Test Guild Automation Testing podcast. With lutes and lyres, the bards began their song. A tune of knowledge, a melody of code. Through the air it spread, like wildfire through the land. Guiding testers, showing them the secrets to behold.

[00:00:34] Joe Colantonio Accessibility is more than just a checkbox. It's become a civil right and a human need. And today, we'll explore how proper automation can elevate your accessibility testing and take it to a whole other level. In today's episode, Crystal Preston-Watson, a senior digital accessibility analyst at Salesforce, joins us, and this session was taken from our Automation Guild 2022 online event, where Crystal took us on an insightful journey addressing high-level questions and critical issues related to accessibility testing, specifically in automation. To learn more about the different types of accessibility testing, you definitely want to listen in. And speaking of accessibility testing, I also want to share an automated accessibility tool from this episode's sponsor, BrowserStack. As you know, and as we go over in this session, accessibility testing is huge nowadays to help you include all the users of your software. Accessibility testing is also becoming more and more important to help you comply with all the new legal standards that are coming up, like the Americans with Disabilities Act or the Web Content Accessibility Guidelines, which help reduce your risk of lawsuits for your software. However, ensuring your website is accessible can be a tough task. Juggling multiple tools for accessibility testing is time-consuming, inefficient, and unscalable, often requiring QA or devs to rerun automated scans across various elements and states just to identify any potential issues. Having to collect all the data from all these tests is very cumbersome, and getting access to devices with screen readers enabled can also be a huge burden. That's why I want to introduce BrowserStack's accessibility testing. With BrowserStack, you can seamlessly test, report, and monitor the accessibility health of your web applications, all from a single platform, identifying both basic and complex accessibility issues at lightning speed. For example, they have a workflow scanner that you can use to scan user workflows in one go, pinpointing issues like missing alt text or insufficient color contrast without the hassle of multiple scans. And for more intricate problems, their assisted tests guide you through identifying advanced accessibility concerns with simple auto-generated questions. Plus, with instant access to real screen readers on real devices like VoiceOver, NVDA, and TalkBack, you can experience your site just as your users with disabilities do, ensuring an inclusive digital experience. BrowserStack Accessibility Testing empowers you to create a website that is not only compliant but also welcomes and delights all your users. So head on over to testguild.me/astack for more details, or click on the link down below to learn more.

[00:03:24] Crystal Preston-Watson Hello everyone, and welcome to Automation and Accessibility: Mutual, Not Exclusive. I'm Crystal Preston-Watson, senior digital accessibility analyst at Salesforce. I want to tell you some of the things you won't find in this session. This is not an Accessibility 101, and a foundation in accessibility is necessary for accessibility testing. But today, I focus on higher-level questions and issues teams need to consider before beginning accessibility automation. That also means I won't be coding or demoing tools. However, I will give you links to items and resources that I discuss today. If you feel that you don't have a solid understanding of the basics of digital accessibility, I encourage you to come back to this video after doing a bit more learning. I'll be here waiting. Well, not me. This is the digital version of me. I don't live in this video. Let's get started by going over the three types of accessibility testing. Manual accessibility testing is when teams test applications manually for accessibility issues that may cause problems for users with disabilities. This testing is conducted using browser and plug-in tools like WAVE, Lighthouse, or Axe, and also assistive technology like screen readers and switches. Currently, on the screen, I have a slide that has several logos for automated tools like the ones I just mentioned briefly, the screen readers JAWS and NVDA, and also an image of two Jelly Bean switch devices. Automated accessibility testing is my focus today. You might be wondering why I listed some semi-automated tools under manual instead of automation. Well, while many of those tools can be fully automated, their browser extensions are also widely utilized in manual testing. This slide showcases three logos: Axe, Applitools, Espresso. Deque's open-source axe-core library powers a considerable portion of the free and paid accessibility tools and automation frameworks available currently. Applitools' Contrast Advisor validates minimum color contrast ratios for text and graphics and can be integrated into existing automation frameworks. Espresso is an Android UI testing framework that has integrated accessibility checks. User testing, or user acceptance testing, gets overlooked by companies when it comes to accessibility. Testing with people with disabilities is necessary to understand how your application or site performs in the real world, and it finds issues and bugs beyond accessibility conformance. This testing needs to be done with more than one person or group with a singular disability within your target audience. Also, make sure that it's paid. People's time is worth money, and given that this testing can find very costly bugs, it's worth more than free software. Part one: are you ready to automate? I regularly ask those who tell me that their testing team wants to automate their accessibility testing if they are ready. I don't ask this expecting an answer, but to get them to reflect on their current accessibility initiative. There's a vast difference between wanting to automate and being ready and able to automate accessibility testing. Teams tend to focus on technical readiness, but with accessibility, two significant considerations must be addressed before a framework is selected or code written to make sure automation is successful. Let's take a look at those now. Experience with accessibility: for many testing teams, accessibility is not an intentional undertaking; as my grandfather liked to say, they fall backward into it.
A demand comes from company leadership in response to customer concerns, complaints, or even ADA lawsuits. There's scrambling to inventory accessibility in the product. There's no time for preparation or ramp-up for training or education. The issue compounds with the realization that accessibility relies heavily on manual testing. There's pressure to automate ASAP to make testing quicker or more effective. You can't see me on screen, but I did make air quotes when saying effective. Don't get me wrong, I am on board with test automation. When done right, it increases productivity and quality and requires a smaller time investment. But the magic words are "when done right." That doesn't happen with teams that do not have a solid foundation in digital accessibility, that struggle to interpret and test against the Web Content Accessibility Guidelines success criteria, let alone understand how disabled people use assistive technology devices to navigate their products. In a presentation called After the Audit, I suggest actions that teams can take to develop accessibility skills and experience to integrate into their testing processes. Here are a few of those actions that your team can undertake. One, establish a baseline of knowledge. It's essential to create a shared language and understanding when discussing accessibility and disability. Two, communicate the why and how of testing, encouraging teams to learn through courses, attend conferences, and providing team training on core fundamentals. Three, play to people's strengths. That is, find people for the tasks that fit their experience or skill. If someone is apprehensive about testing with screen readers, and you still want them to have that skill, they can work on keyboard testing and teach it to others while they skill up with other assistive technology. Four, shift accessibility to the left. It's not unusual for testing teams to become the sole champion of accessibility within the company. For accessibility to be a genuine priority and done with quality, it has to be the concern of the whole organization. This means promoting and informing stakeholders and leadership that accessibility cannot be tested away and needs to be a company-wide priority. This last action leads me to the second consideration for being automation ready: capacity and support. As I previously mentioned, testing often becomes the sole champion of accessibility within a company. Unfortunately, it is considered an afterthought or a nice-to-have instead of a necessity. This is reminiscent of some opinions about QA testing, especially when it comes to manual and exploratory testing. I was a QA for nine years, and I'm all too familiar with the siloing and devaluing of testing in QA. It's not shocking to see both viewed the same, and why testing teams become the de facto owner of accessibility. But given the individuals, situations, and circumstances that are affected by inaccessibility in applications and sites, this is not something that can be left to tech debt or a one-off crunch sprint. There are three items you should evaluate to see if you have the capacity and support to begin automation. Everyone is on board and taking up appropriate roles when it comes to accessibility. This is why it needs to be a company-wide priority with backing, training, and education for everyone. If the company accessibility initiative boils down to testing, it's a failed initiative, and no automation will make up for that.
Ongoing manual testing: I'll get into the particulars later, but manual testing cannot be automated away for accessibility. Automation and manual here are generally mutual. If you have the headcount for both, that's great, but if you don't, then manual is the one that needs to continue. Keeping regression in mind: regression in accessibility testing can be problematic, mainly when bugs and issues are found through audits, as third-party vendors usually conduct them. It's important to make sure bugs fixed due to audits or other avenues make it into your regression testing. It's not unusual to find the same bugs with every new audit, because regression test cases never make it into the in-house testing team's regression suites. If you aren't frozen in existential dread after considering what goes into being automation ready, congratulations, you probably are ready. But don't break out that code just yet. Part two: what you should and should not automate. A point that needs to be driven home regarding accessibility automation is that it will not find everything. Most accessibility experts agree that between 20% and 40% of accessibility issues are found with automated testing. Combining manual and automation is a balanced and holistic approach to accessibility testing. Teams might be reluctant to add manual testing for accessibility, especially if they are moving to automation in other areas, but manual testing is unavoidable for catching accessibility issues that automation cannot detect. Let's look at areas where you can enhance your accessibility testing with automation and the best places for manual. Unit and integration: unit and integration tests are perfect for accessibility automation. This is where you can detect low-lying bugs that take vital time during manual testing, and watch for regression. This also allows you to incorporate accessibility into your continuous integration builds. Some of the things you would want to focus on in unit and integration tests would be the following, and I want to credit Marcy Sutton for this list, because she's where I got this from. Unit: component-specific behavior, interaction and focus APIs, ARIA states. Integration: real-world browser testing, document and page-level rules, and color contrast. UI: I don't usually encourage UI automation for accessibility. UI tests are known to be flaky due to frequent adjustments and changes. My personal experience and opinion is that that time is better spent creating unit and integration tests, or is best left to a person testing manually. But if you do want to have UI automation, these are the scenarios that tend to work best: multi-page workflows and form flows. Manual: an automation framework cannot make subjective interpretations of whether something is or isn't accessible. Also, when it comes to assistive technology devices and software, it's either impossible or not practical to automate their operation. The heart of accessibility is the human experience. This is why manual testing is priceless and a necessity for accessibility. Here are some of the areas you would want to focus on: keyboard interactions, assistive technology devices, captions, transcripts, alt text quality, and error identification. Part three: are you ready to automate? At the beginning of this video, I said that there was a difference between wanting to automate and being ready to automate your accessibility testing.
I ask you, given all that I've discussed, your team's experience with accessibility, your capacity, and support from within your company, are you ready to automate accessibility? Again, I'm not expecting an answer here. This is the digital version of me. I want to thank the Automation Guild for giving me the opportunity to speak to you today. And if you take anything away from this video, let it be this: accessibility is more than design, code, and testing. It's about humans and their experiences. Accessibility is also a human and civil right. If you have any questions about the information I presented today, or you would like links to some of the resources and items that I've talked about, you can reach me via Twitter @scopicengineer, on LinkedIn (just type in Crystal Preston-Watson), and my website, CrystalPrestonwatson.com. And I will see you next time. The real version of me, the digital version, one of us will see you again, hopefully.
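To make the unit-level checks Crystal recommends a bit more concrete, here is a minimal sketch using jest-axe, a Jest wrapper around the axe-core library she mentions. The markup is a contrived example, not something from the talk, and it assumes Jest running in a jsdom environment.

```typescript
// Minimal sketch: a unit-level accessibility check with jest-axe (axe-core under the hood).
// Assumes Jest with a jsdom test environment and the jest-axe package; the markup is contrived.
import { axe, toHaveNoViolations } from 'jest-axe';

expect.extend(toHaveNoViolations);

test('icon-only button exposes an accessible name', async () => {
  // Without the aria-label, axe's "button-name" rule would flag this button.
  const markup = `
    <button aria-label="Search">
      <svg aria-hidden="true" focusable="false"></svg>
    </button>
  `;

  const results = await axe(markup);
  expect(results).toHaveNoViolations();
});
```

Checks like this catch low-level markup and ARIA problems early in continuous integration, while keyboard interactions, screen readers, captions, and alt text quality stay with manual testers, as described above.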

[00:16:00] Thanks again for your automation awesomeness. For links to everything of value we covered in this episode, head on over to testguild.com/a515. And if the show has helped you in any way, why not rate it and review it in iTunes? Reviews really help in the rankings of the show, and I read each and every one of them. So that's it for this episode of the Test Guild Automation Podcast. I'm Joe, my mission is to help you succeed with creating end-to-end, full-stack automation awesomeness. As always, test everything and keep the good. Cheers.

[00:16:34] Hey, thank you for tuning in. It's incredible to connect with close to 400,000 followers across all our platforms and over 40,000 email subscribers who are at the forefront of automation, testing, and DevOps. If you haven't yet, join our vibrant community at TestGuild.com where you become part of our elite circle driving innovation, software testing, and automation. And if you're a tool provider or have a service looking to empower our guild with solutions that elevate skills and tackle real world challenges, we're excited to collaborate. Visit TestGuild.info to explore how we can create transformative experiences together. Let's push the boundaries of what we can achieve.

[00:17:19] Oh, the Test Guild Automation Testing podcast. With lutes and lyres, the bards began their song. A tune of knowledge, a melody of code. Through the air it spread, like wildfire through the land. Guiding testers, showing them the secrets to behold.

{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}