How we think AI can help with accessible design

Disability, in very simple terms, can be defined as the barriers that someone with an impairment faces while trying to participate in the world around them. As designers, we can design things without those barriers built in. So, in practical terms, that might mean checking that colour contrasts are sufficient for those with low vision.

However, sometimes ensuring that we have removed those barriers from our designs can be a lot of additional work, or require complex and specialist knowledge. This is where we can get AI to help us bridge the disability divide, as a tool within our design process. 

There are already some tools out there that can help you; some use AI and some do not (not everything marketed as using AI really is using AI!). But what they have in common is that they can make accessible design easier, faster, more accurate and, well, more accessible.

The tools we use now

Plugins for assessing colour contrast 

These are really handy to have running as we design, or to run intermittently, to make sure we aren't suggesting any low-contrast combinations right from the concept stage. We particularly like ones that check against the more nuanced and accurate APCA method, such as the Text contrast checker plugin on Figma.

Online colour contrast checkers 

These can be really useful if you aren't a designer using design software like the Adobe Suite or Figma. There are many to choose from: some check a couple of colours at a time, others give you full ratio readouts. We built one that can do it all; you can have a go with it here.
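Under the hood, most of these checkers implement the WCAG 2.x contrast-ratio formula, which compares the relative luminance of two colours. Here is a minimal sketch in Python (the function names are our own, for illustration, not from any particular tool):

```python
def relative_luminance(rgb):
    """Relative luminance per WCAG 2.x, with rgb as 0-255 integers."""
    def channel(c):
        c = c / 255
        # Linearise the sRGB channel value
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colours: (lighter + 0.05) / (darker + 0.05)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black on white gives the maximum possible ratio, 21:1
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # → 21.0
```

WCAG asks for at least 4.5:1 for normal body text and 3:1 for large text, so a checker simply compares this ratio against those thresholds.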

ChatGPT to help analyse documents or images

There are some more complex tasks that you can ask AI to help you with. For example, you can get it to analyse a Word document, and tell you which tags you should use for each bit of text so that you can create an accessible PDF.

You can also ask it to analyse an image, estimating rough type sizes and colour contrast ratios, leaving you with a readout that tells you whether an image (perhaps a social media post) meets WCAG accessibility standards.

See your designs from the perspective of someone else

Stark is a great tool that allows you to simulate conditions like various forms of colour vision deficiency. This lets you check colour contrast in a new way, and really helps to bring to life the barriers others may be experiencing with our designs.

Hemingway App 

This tool helps to make your writing, and therefore your content, more accessible. It reads and analyses the complexity of your writing and tells you what grade level it falls under, helping to ensure that you aren't excluding people through the copy and content you write.

They now have an AI-powered version of the app, which will help suggest adjustments for you, simplifying the editing process further. 
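Grade-level scores like this typically come from a readability formula such as Flesch–Kincaid, which rewards short sentences and short words. A rough sketch in Python (Hemingway uses its own method; the vowel-group syllable counter below is a crude assumption for illustration only):

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of vowels; every word has at least one syllable.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    """Approximate Flesch-Kincaid grade level for a passage of text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (
        0.39 * (len(words) / sentences)
        + 11.8 * (syllables / len(words))
        - 15.59
    )
```

A sentence like "The cat sat on the mat." scores well below grade 3, while long, polysyllabic sentences push the grade sharply upwards.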

The tools we want to see in the future

At the moment, brand accessibility (what we call all the elements of accessibility a brand might want to consider: not just design, but also content writing and the technical set-up of documents) can involve quite a lot of process, with a different tool here and there for each part.

What we would love to see is a single tool that has been trained across lots of different accessibility principles; one that could take an image, a PDF or a PowerPoint presentation and analyse it for accessible design, technical set-up (particularly useful for PDFs) and the writing and language used. The tool could learn a brand's style over time, or have brand guidelines uploaded at the start, so that any suggestions remain on brand.

We would also love help with assessing the accessibility of one typeface over another. Choosing an accessible typeface involves a lot of elements: x-height, how open the counters are, and how distinct characters remain when reversed (think b and d). It's not just a case of serif versus sans serif. This is the kind of thing AI could be trained to look out for, allowing someone to create a tool that suggests similar but more accessible typefaces.

An area of current research is AI changing the design of something based on biometric data. AI can analyse your needs: everything from your heart rate to your energy levels, how much sleep you have had, or whether you have recently had a flare-up of symptoms. That data could then change the way a website appears on your phone or other connected device. It could make a typeface larger, or swap to a more dyslexia-friendly one. There is much more research needed in this area, and there are issues of personal privacy that might put potential users off. But we think it's a really interesting area that could help with the fluctuating nature of disability and the accessibility tweaks it demands.

There is huge potential for AI to help brands become more accessible and close the disability gap. Have you got any ideas for what tools you would like to see developed? We’d love to hear about them. 

Further reading