iOS Accessibility and VoiceOver: A pair of glasses for those who don't see
As iOS devices become more popular, we increasingly need to build features that include people who cannot use apps the same way most of us do. One of the most important qualities of an app, in my opinion, is its ability to be usable by any kind of user. We need to remember that not everyone perceives the world the same way: some people cannot hear sounds very well (or cannot hear at all), some users cannot hold devices in their hands, and some users cannot see anything on the screen.
For these groups of people, any effort put into a traditional user interface and user experience would be in vain, because they will never experience the UI the way it was designed. So we developers must find another way to give them the same experience and usability when they interact with our iOS applications. But how?
Fortunately, Apple provides a great set of accessibility APIs that help us build interfaces people can interact with, using all the available features in a way they "can see". These accessibility features cover multiple kinds of disability, but in this tutorial we focus on blind users. I may write articles about other kinds of accessibility soon.
Most companies don't give accessibility all the attention it really needs. But I tell you: if you don't apply accessibility in your application, you can lose millions of potential users, since about 285 million people in the world have impaired vision and 39 million are totally blind. You should also think about the difference you can make in someone's life!
VoiceOver: The abstract interface for blind users
Apple provides a great mechanism for blind users to navigate iOS and macOS: VoiceOver. VoiceOver is a completely voice-oriented interface that describes each UI element as the user swipes across the screen, announcing the next widget, what it is, what it means, and possibly a hint about what to do in the current context. It also alerts the user when a layout change occurs.
Imagine the following abstract screen on an iPhone:
As I said, all the accessible UI elements are read to the user one at a time, from left to right and from top to bottom. So, with VoiceOver activated, the first element to be read would be the title, followed by the plus button on the right side of the navigation bar, then the button and the label, and so on.
The reading order is described by the green path in the image below:
If you want to try it on your physical device, go to Settings > Accessibility > VoiceOver and tap to activate it. You can also configure VoiceOver's speaking rate and change its voice. You can even define an accessibility shortcut so that triple-clicking the side button toggles VoiceOver.
Now explore the iOS system with VoiceOver enabled. Notice how the home screen announces each app and folder in the order described above.
Brilliant. Now we are going to build your first accessible app, which displays a list of notes on its home screen. We will look at each accessibility property of a UI element and how it affects the VoiceOver speech.
App: Accessible Notes
First things first. Open Xcode and create a new Single View Application project called "AccessibilityDemo". The first thing to do is embed the UIViewController you see on the storyboard inside a UINavigationController: select the view controller, go to Editor > Embed In, and choose Navigation Controller. Now you have a navigation controller holding the default view controller as its root.
Next, insert a UIBarButtonItem as the right button of the navigation bar and a UITableView constrained to occupy the entire screen.
Great. Now control-drag from the plus button and the table view to create IBOutlets in your ViewController.swift file:
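The outlets should look roughly like this (the property names here are my own choice; feel free to rename them, but the rest of the tutorial assumes these names):

```swift
import UIKit

class ViewController: UIViewController {

    // Connected to the plus button in the navigation bar.
    @IBOutlet weak var addButton: UIBarButtonItem!

    // Connected to the full-screen table view.
    @IBOutlet weak var tableView: UITableView!
}
```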
Next we will create a simple model for a note as a struct. Create a file called NoteModel.swift and copy the following data structure:
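A minimal sketch of the model, assuming a note needs a title, a subtitle, and a favorite flag (matching the cell we build next):

```swift
// NoteModel.swift
// A simple value type describing one note.
struct NoteModel {
    let title: String
    let subtitle: String
    let isFavorite: Bool
}
```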
Next, create a Cocoa Touch file with a xib for a UITableViewCell and call it NoteTableViewCell. Edit the xib, adding an orange view on the side to represent the favorite indicator. Insert two labels, one small and one large, representing the title and subtitle of a note, respectively. Next to the labels, add a button and change its text to "Select". The xib should look like this:
Drag all the IBOutlets to the swift file:
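The cell's outlets end up as below (again, the names are assumptions of mine, matching the views described above):

```swift
import UIKit

class NoteTableViewCell: UITableViewCell {

    // The colored side view indicating a favorite note.
    @IBOutlet weak var favoriteView: UIView!

    // Title (small) and subtitle (large) labels.
    @IBOutlet weak var titleLabel: UILabel!
    @IBOutlet weak var subtitleLabel: UILabel!

    // The "Select" button next to the labels.
    @IBOutlet weak var selectButton: UIButton!
}
```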
Now, declare a variable called "noteModel" holding an optional instance of a note. Then, create a method called setup, which receives a note model and sets the labels' text from the model's properties. The side view turns yellow if the note is marked as favorite; if not, it stays white.
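A sketch of that setup method, assuming the outlet names from the previous step:

```swift
var noteModel: NoteModel?

func setup(with model: NoteModel) {
    noteModel = model
    titleLabel.text = model.title
    subtitleLabel.text = model.subtitle
    // Yellow marks a favorite note; white means a regular one.
    favoriteView.backgroundColor = model.isFavorite ? .yellow : .white
}
```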
But we haven't explained the most interesting part yet: the accessibility! We created a table cell and implemented a mechanism to take a model and fill the UI for a user who can look at the interface, but how would a blind user "see" what we built? Let's find out.
Apple provides a way to apply accessibility properties to UI elements, but how do we decide how they are described to the user? How can we specify, for example, that a label's own text should be its accessibility description?
Well, the first requirement for a UI element to be read to the user is that it be recognized as an accessible element. Every descendant of UIView has a boolean property that marks it as accessible: "isAccessibilityElement". If it is set to true, the element will be listed by VoiceOver as the user swipes through the screen.
The other properties we shall mention are:
- accessibilityLabel (String?): Assign here what you want VoiceOver to say about the element. For a label, you would usually use the label's own text; the same goes for a button's title. For any other UI element, choose an accessibility label that summarizes what the element means, like "sound progress bar".
- accessibilityTraits (UIAccessibilityTraits): This option-set property describes what kind of element this is: text, a button, a header, a keyboard, or another type. This is very important because it indirectly tells the user how they are supposed to interact with it. An element can hold more than one accessibility trait.
- accessibilityHint (String?): This property is optional most of the time; it gives a short hint about some particular behavior of the element. For example: "Clicking this button erases your progress in the game."
- accessibilityIdentifier (String?): This property is not read by VoiceOver; in practice, it only works as an identifier for UI testing.
Now that we described which are all the accessibility properties of a UIView, allow me to illustrate how VoiceOver decides what to say when the user swipes to an element:
When the element is focused, the speech order is: accessibilityLabel, accessibilityTraits, accessibilityHint. Remember that the UIView instance must be marked as an accessibility element.
If we are in a real application and the button above is focused, VoiceOver would say: "A random button, button. Click to do something."
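In code, configuring that button would look something like this (the label and hint strings are just the example values above):

```swift
import UIKit

let randomButton = UIButton(type: .system)

// Mark the button as visible to VoiceOver and describe it.
randomButton.isAccessibilityElement = true
randomButton.accessibilityLabel = "A random button"   // spoken first
randomButton.accessibilityTraits = .button            // spoken as "button"
randomButton.accessibilityHint = "Click to do something" // spoken last
```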
Applying accessibility to our app
Now that we learned how to apply accessibility to UI elements, we can insert a piece of accessibility to our interface. Go back to the project.
There are two ways to apply accessibility properties to our UI: through Interface Builder or programmatically. If you are used to storyboards and xibs, select an element in the main storyboard and open the Identity Inspector; you will see an Accessibility section near the bottom where you can set the accessibility properties I described.
But if you are like me and prefer fewer images and more code, feel free to follow along assigning everything in code.
Go to the ViewController file and let's add accessibility to the UI elements. The title's accessibility is already set by default, but we have not customized the add button. Please create a new method called "applyAccessibility" as below:
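A minimal sketch of that method, assuming the addButton outlet from earlier (the spoken strings are my own suggestions):

```swift
private func applyAccessibility() {
    // Describe the plus button to VoiceOver users.
    addButton.isAccessibilityElement = true
    addButton.accessibilityLabel = "Add note"
    addButton.accessibilityTraits = .button
    addButton.accessibilityHint = "Adds a new note to the list"
}
```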
Then call this method from the "viewDidLoad" method.
Now, we must create an array of note models to fill the table view data source:
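For example, a few hardcoded notes (the contents here are placeholder data of my own):

```swift
private var notes: [NoteModel] = [
    NoteModel(title: "Groceries", subtitle: "Milk, eggs and bread", isFavorite: true),
    NoteModel(title: "Meeting", subtitle: "Prepare slides for Monday", isFavorite: false),
    NoteModel(title: "Workout", subtitle: "Leg day at 6 pm", isFavorite: false)
]
```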
We haven't conformed to the table view protocols yet, so create an extension for the ViewController and place the following:
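A sketch of that extension, assuming the cell's reuse identifier matches its class name:

```swift
extension ViewController: UITableViewDataSource, UITableViewDelegate {

    func tableView(_ tableView: UITableView,
                   numberOfRowsInSection section: Int) -> Int {
        return notes.count
    }

    func tableView(_ tableView: UITableView,
                   cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        guard let cell = tableView.dequeueReusableCell(
            withIdentifier: "NoteTableViewCell",
            for: indexPath) as? NoteTableViewCell else {
            return UITableViewCell()
        }
        cell.setup(with: notes[indexPath.row])
        return cell
    }
}
```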
Register the table view cell:
And fill the "viewDidLoad" method:
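Putting it together, viewDidLoad registers the cell's xib, wires up the table view, and applies the accessibility we wrote:

```swift
override func viewDidLoad() {
    super.viewDidLoad()
    title = "Notes"

    // Register the xib-based cell under its reuse identifier.
    tableView.register(UINib(nibName: "NoteTableViewCell", bundle: nil),
                       forCellReuseIdentifier: "NoteTableViewCell")
    tableView.dataSource = self
    tableView.delegate = self

    applyAccessibility()
}
```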
Now that our view controller is complete, let's add some accessibility to the cells. But this raises a question: do we really need to make all the labels, buttons, and subviews of our cells accessible? Must the blind user swipe through every subelement of a cell? The answer is no. Even though we are not designing for a visible UI here, we must still care about the experience blind people have when using our app: it would be very annoying to navigate through many elements that are all parts of the same view. What we need to do in these cases is consolidate every label and view into a single accessible view that conveys all the information at once.
Drag a UIView onto the table view cell xib. Then move all the other elements, except the select button, inside this auxiliary view. Next, drag another UIView and place the select button inside it. Great: now our cell has two separate accessible subviews, one with the information about its content and the other holding the select button. Basically, we now have two accessibility components for our cell.
Don't forget to drag the IBOutlets of the two accessibility views to the cell file.
Let's implement a new method for our accessibility:
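A sketch of the cell's accessibility method, assuming the two container outlets are named noteInfoView and selectButtonView (rename to match whatever you called them):

```swift
func applyAccessibility(with model: NoteModel) {
    // The cell itself stays silent; its two subviews speak instead.
    isAccessibilityElement = false

    // One element summarizing the whole note.
    noteInfoView.isAccessibilityElement = true
    noteInfoView.accessibilityLabel =
        "\(model.title), \(model.subtitle)" + (model.isFavorite ? ", favorite" : "")
    noteInfoView.accessibilityTraits = .staticText

    // One element wrapping the select button.
    selectButtonView.isAccessibilityElement = true
    selectButtonView.accessibilityLabel = "Select note"
    selectButtonView.accessibilityTraits = .button
}
```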
What is happening here is that we consolidate all the information from the note model into the noteInfoView; when the user swipes to the cell, it announces the note's data. The next element is the select button view, where the user is informed about the button and can activate it. Pay attention to the first line, where the entire cell is set to not be accessible, since we want its inner components to be accessible instead. Don't forget to call this method from setup.
Brilliant, now we have a completely accessible application showing a list of notes. Run it and you should see the interface below:
If you run it on a physical device, you will notice that the accessibility works as expected. But if you don't have access to a physical device, don't worry: macOS provides a great tool for inspecting the accessibility of an Xcode project. Open the Accessibility Inspector via Xcode > Open Developer Tool > Accessibility Inspector.
The Accessibility Inspector lets you simulate VoiceOver on the iOS Simulator. In the selector at the top right, choose the Simulator as the target, then click the target button at the top to enable inspection while your app is running.
Now you can point at each element on the screen and see its value, label, traits, identifier, and the VoiceOver description under "Quicklook". We now have a completely accessible application.
Bonus: Accessibility Notifications
We didn't use this feature in our project, but it is important to talk about how the user is told about changes. Whenever something pops up, a new item is added to the list, or any layout change occurs, we must notify the user about what happened.
For example, if a UI alert suddenly appears, we must tell the user:
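A sketch of how that notification could be posted, here from a hypothetical helper that presents an alert:

```swift
func show(alert: UIAlertController) {
    present(alert, animated: true) {
        // Only bother posting if VoiceOver is actually running.
        if UIAccessibility.isVoiceOverRunning {
            // Tell VoiceOver the screen changed; the alert itself
            // becomes the new focus target.
            UIAccessibility.post(notification: .screenChanged, argument: alert)
        }
    }
}
```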
Basically, we first verify that VoiceOver is running; if it is, we post a notification telling that the screen changed (it could also be a scrolling change or a simple announcement), with an argument describing the change. In the case of an alert, the argument is the alert itself.
Making your app accessible is one of the most important tasks when building a new project. You are building a bridge that allows thousands of people to use your app, expanding its reach. Mobile devices came to help people in many aspects of life, and you are not doing the whole job if you are not considering every kind of user.
I hope this changed the way you plan your applications, and that you have enjoyed it!