Accessibility in iOS

A few weeks ago, we received an email from a blind iOS user. He wanted to use one of our apps, Alchemy Synth Mobile Studio, a powerful synthesizer for iOS, and wondered if we could make Alchemy more accessible. Until then, we had not done this for any of our apps. We all knew that iOS has some built-in support for accessibility, but none of us knew exactly how it worked. So we did some research, and with the help of the user who emailed us, we were able to make the main features of Alchemy accessible in only a couple of hours. We'll present our findings in this post.

What is accessibility?

In iOS, accessibility features help people with disabilities use their iPhone or iPad. They fall into three categories: hearing, physical & motor skills, and vision. In this blog post we will talk about accessibility for visually impaired users, for whom there are several aids. It is possible to enable VoiceOver or zooming, or to invert the colors (Settings app > General > Accessibility). It's even possible to control an iOS device using a braille display. Depending on their level of visual function, users can find the combination of settings that best suits their needs. Most of these aids work automatically, but to get VoiceOver to work properly, you need to add some extra information to the UI elements in your app.


When VoiceOver is enabled, every user interface element must be selected before it can be used. The user can inspect the UI by dragging a finger over the screen, selecting views and controls along the way. When a new element is touched, its name is spoken aloud. Once an element is selected, double tapping fires the default action for that element.

Why would I make my app accessible?

Well, why wouldn't you? Apple has put a lot of effort into integrating accessibility into their mobile operating system. Developers often think supporting it takes a lot of time while almost no users profit from it. I thought this as well, but implementing accessibility in Alchemy only took me eight hours, which isn't much if you take into account the complexity of the app and the research I had to do in that time. As for the users profiting from accessibility: there is an online community of visually impaired Apple users. Let's be honest, I don't know how big this community is, but if a little extra effort can help them use your app, I would say: let's do it!

OK, but how?

As I mentioned before, Apple has done the hard part for you. If you enable VoiceOver (and get used to the new gestures), all native UI elements will be selectable. If you tap such an element, its content is spoken aloud. By default, this will be the title (for buttons), the text (for labels and text fields), the filename (for images) or the value (for sliders). For any element, you can manually specify a label and a hint, which are spoken when the element is selected. If the element has a value (like a slider or a switch), this value is spoken as well.
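As an illustration, setting a label and hint on a standard control is a one-liner each; the button and its purpose below are made up for the example:

    // A hypothetical "record" button, e.g. configured in viewDidLoad.
    // The label overrides the spoken title; the hint explains what
    // happens after a double tap.
    recordButton.accessibilityLabel = @"Record";
    recordButton.accessibilityHint = @"Starts recording the current track.";

VoiceOver reads the label immediately on selection and the hint after a short pause, so keep labels short and put the explanation in the hint.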

It gets more interesting when you want to add accessibility features to custom UI elements. You still need to set the accessibility label and hint as before. However, if your UI element can take different values, you also need to set the accessibility value and update it whenever your UI element's value changes. You also need to set the UIAccessibilityTraits. Accessibility traits describe the custom element to VoiceOver so VoiceOver knows how to treat it. There are traits describing what an element is (UIAccessibilityTraitButton, UIAccessibilityTraitImage) and traits describing a behavior (UIAccessibilityTraitPlaysSound, UIAccessibilityTraitUpdatesFrequently). Check the Accessibility Programming Guide for more technical details.
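For a custom view, that could look something like the sketch below; the control and its cutoff value are hypothetical:

    // In the initializer of a hypothetical custom filter knob.
    self.isAccessibilityElement = YES;  // make VoiceOver treat this view as one element
    self.accessibilityLabel = @"Filter cutoff";
    self.accessibilityTraits = UIAccessibilityTraitAdjustable;

    // Whenever the control's value changes, update the spoken value too:
    self.accessibilityValue = [NSString stringWithFormat:@"%.0f percent", cutoff * 100.0f];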

In Alchemy we created a custom control for changing the volume. It looks like a button, but you can drag left or right to decrease or increase its value.

[Screenshot of Alchemy with the volume button highlighted]

Since this is not default behavior for buttons, we need to specify an extra accessibility trait, UIAccessibilityTraitAdjustable. When using this trait, you need to implement two methods in your subclass:

- (void)accessibilityDecrement {
    [track setVolume:[track volume] - 0.05f];
}

- (void)accessibilityIncrement {
    [track setVolume:[track volume] + 0.05f];
}

Now, when VoiceOver is enabled and the volume button is selected, the user can swipe up or down to increase or decrease its value.

Wrap up

If you're interested in making your app accessible, you'll see that most of your UI already is. You could start by checking your native iOS elements and see if they need accessibility labels and hints to clarify their function. Label your images, and if an image is used as a button, describe what it does. Then, for custom UI elements:

  1. Set the accessibility label and hint
  2. Set the accessibility value when its value is changed
  3. Set the relevant traits
  4. Override the necessary methods (such as accessibilityIncrement and accessibilityDecrement) in your subclasses
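Put together, a minimal sketch of such a subclass might look like this; the class name and the volume property are placeholders, not Alchemy's actual code:

    // A hypothetical adjustable volume control.
    @interface VolumeButton : UIControl
    @property (nonatomic) float volume;  // 0.0 - 1.0
    @end

    @implementation VolumeButton

    - (BOOL)isAccessibilityElement {
        return YES;
    }

    - (NSString *)accessibilityLabel {
        return @"Volume";
    }

    - (NSString *)accessibilityValue {
        // Spoken after the label, and again after each adjustment.
        return [NSString stringWithFormat:@"%.0f percent", self.volume * 100.0f];
    }

    - (UIAccessibilityTraits)accessibilityTraits {
        return UIAccessibilityTraitAdjustable;
    }

    - (void)accessibilityIncrement {
        self.volume = MIN(self.volume + 0.05f, 1.0f);
    }

    - (void)accessibilityDecrement {
        self.volume = MAX(self.volume - 0.05f, 0.0f);
    }

    @end

Overriding the accessibility getters, as done here, is an alternative to setting the corresponding properties in the initializer; it keeps the spoken value in sync without extra update code.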

By paying attention to accessibility from the start, you won't find yourself traversing all elements top down to add accessibility features just before the release date. Make it part of your programming routine and it will hardly cost you any time, but it will certainly increase your user base.
