iOS Native Mobile Accessibility


– [Moderator] The
broadcast is now starting. All attendees are in listen only mode. – [Laura] Hey, everyone. Thanks for joining the
iOS and Native Mobile Accessibility webinar from Deque Systems. Our presenter, Chris
McMeeking, will be presenting. We’re gonna start a little
bit after 2:00 p.m., so just sit tight while we wait for everyone to join the webinar. Hey, everyone, Laura
Goslin here from Deque, Event Manager and Marketing Assistant. We’re just waiting a few more minutes. It’s now 2:00 p.m. We’re just gonna wait
for a couple more minutes to make sure that everyone can hop on before we begin the webinar. Hey everyone, it’s now two
minutes after 2:00 p.m., we’re gonna go ahead and get started. Thanks again for joining this webinar. It’s the iOS Mobile Accessibility webinar presented by our Deque expert on Android and iOS, Chris McMeeking. Just some quick housekeeping things before we get started. If you are in need of captions, they are posted in the chat box on the GoToWebinar control panel. There’s also
a link to the captions sent out in the original email that had the login
information for the webinar. If you have a question, please
hold it until the end. You can post it in the question box on the GoToWebinar control panel. Please be sure, though, if you’re referring to a specific slide, to include the slide number with it. With those things out of the way, I’m now gonna pass things over to our presenter, Chris McMeeking. – [Chris] Hello, Laura,
am I coming in clear? – [Laura] Yep, you’re all good. – [Chris] All right. Good afternoon, everybody. My name is Chris McMeeking. As Laura said, I’m a
Senior Software Engineer here at Deque. I own all of our native mobile accessibility
testing products. Also member of the W3C Mobile
Accessibility Task Force, helping define mobile concerns as WCAG 2.1 moves forward, and I’m ChrisCM on StackOverflow. If you have any development-
specific questions about how to accomplish things, tag them with voiceover, ios, android, or talkback; those are all tags that I watch very actively. I’m guaranteed to see the question, though I don’t always have time to answer. But that’s a good way to get my attention. With that, let’s begin the webinar. Let’s do a quick overview of the stuff that we’re gonna cover today. This is gonna look very similar if you were in my Android webinar. We’re gonna cover very similar topics, just from an iOS point of view. The big picture is implementing
accessibility is easy. When you’re talking about
interacting with the APIs on iOS, the interfaces and things
that Apple gives you are really straightforward. But getting it right
is the difficult part. When you’re
talking about a blind user or a physically disabled user, the difficulties that they have interacting with these
assistive technologies are not often the difficulties that developers have when
using assistive technologies, because we don’t use them,
we’re not familiar with them. So knowing what to do is difficult, and in particular knowing what not to do. The frustrations a developer has using VoiceOver in an application are going to be different
than the frustrations someone who uses VoiceOver
every day has using an application. So we’re thinking of the
user in this context. You gotta remember that
these users are very familiar with the assistive
technology and how they work. So don’t project your
difficulties with VoiceOver onto a blind person’s
difficulties using VoiceOver, ’cause they’re often gonna
be very, very different. So knowing what to do and
knowing what not to do is hard. Itinerary, we’re gonna talk
about applying WCAG 2.0 to iOS. WCAG 2.0 has
guidelines, and that thing that I just said, about how projecting a developer’s difficulties with an assistive technology
differs from the actual difficulties, WCAG 2.0 can help you
answer those questions. The iOS accessibility ecosystem, what assistive technologies are available, what are the public
interfaces that you have to interact with those
technologies, so on and so forth, API demos, I have some staged examples. One of the things that I’m gonna say is, I usually like to demo
these live with VoiceOver. But it’s very difficult
to do that across the wire in a way that’s understandable. So what I’ve done is, I’ve
marked them up on the slides. You’ll see what I mean
when I get to those demos. Then also, advanced APIs and
demos from Apps in the Wild, I really wanna get to the advanced APIs and the demos from Apps in the Wild. I’ll cover it if we have time. And I can’t cover everything. So if I miss something, ask a question or feel free to reach out with a question to Laura after the webinar. Let’s talk about WCAG 2.0 and how it maps to a mobile environment and
mobile-specific considerations. Screen size, it’s
important to remember that, when you’re dealing with mobile devices, we have to deal with
things on a smaller screen and so create layouts
that are user friendly. And remember that not
everyone is going to be able to understand everything
within the concept of (mumbles) screen size, so
we have to take into account zoom assistive technologies. Zoom assistive technologies are available on desktop as well, but they’re really, really
commonly used on mobile devices, especially dynamic type sizing. Dynamic type sizing basically allows you to control the font, make sure that your layouts and such respond to dynamic type sizing. And focus control. Focus control is often handled poorly, particularly in native
mobile environments. When you’re on a desktop and dealing with a desktop
assistive technology, you’re almost always gonna be
talking about keyboard users when you’re talking about focus control, whereas in iOS, you have this separation of accessibility focus and input focus that happens more frequently. So focus control is really important and understanding the difference between input focus and
accessibility focus. WCAG 2.0 operable considerations.
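The dynamic type sizing point can be sketched in Swift. This is a minimal sketch assuming a UIKit screen; the class and label names are illustrative, not from the talk:

```swift
import UIKit

final class ReceiptViewController: UIViewController {
    private let titleLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Use a text style, not a fixed point size, so the font tracks
        // the user's Dynamic Type setting.
        titleLabel.font = UIFont.preferredFont(forTextStyle: .body)
        // Resize live when the user changes the setting.
        titleLabel.adjustsFontForContentSizeCategory = true
        // Let the layout grow instead of truncating at large sizes.
        titleLabel.numberOfLines = 0
        view.addSubview(titleLabel)
    }
}
```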
operable considerations. With mobile operating systems, again input focus and accessibility focus, understanding the difference
between those two things is really important. Gestures, one of the
things that mobile devices allow you to do is, they
allow you to interact directly with the screen. And there is this tendency
to wanna do cool things with gestures, like when
you swipe back and forth, it does cool things and if you do a curlicue
gesture on the screen, it’s some shortcut for some action. When VoiceOver or, say, Switch Control is running,
anyone who’s using an assistive technology isn’t necessarily going to
be able to do those gestures. So we want to make sure that
there’s alternative ways of performing those things. And some things, like device manipulation, sometimes those are things
that are just specific to that, like let’s say a game, where you’re tilting the device and you’re relying on
the gyroscopes to, like, balance a ball on the
middle of the screen. There is no way to supply that
functionality alternatively. So sometimes the device
manipulation is required as the feature of the
thing, and that’s okay. But what you wanna avoid is things like shaking the device that creates some type of action and have that action not
be available elsewhere. Touch target size. Touch target size, things
need a minimum size. Your finger is the clicker and, I mean, I don’t know about you guys,
I have pretty fat fingers even tapping some links in the app store, in Apple’s app store, can get
pretty difficult sometimes. Understandable screen orientation is a really important
consideration on mobile. We have to remember that
users with disabilities are frequently going to have, especially physical disabilities, are going to have devices that are mounted in front of them. And if you have an
application that is required to be in portrait layout and someone has their device
mounted in front of them in landscape layout, that application is
completely inaccessible. So unless the orientation of your thing is required for its functionality, you have to support
different orientations, at least 180 degrees of support, although once you’re
supporting 180 degrees, 270, you know, like the
difference between right and left, nobody’s gonna mount
their device upside down. But at least you should
be able to tilt it right, you should be able to tilt it left. And maintaining consistent
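The orientation guidance can be sketched in Swift; a minimal, illustrative UIKit example:

```swift
import UIKit

final class MountedDeviceViewController: UIViewController {
    // Portrait plus both landscape orientations; upside-down portrait is
    // the one orientation nobody mounts a device in.
    override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
        return .allButUpsideDown
    }
}
```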
layout given different sizes and orientations is important, although ultimately a best practice. Robustness. Robustness is a really important category when it comes to native mobile development ’cause robustness helps you
create maintainable code. I will talk more about this
throughout the presentation and how it applies to specific APIs. But basically what robustness means to me in a native environment is doing things the way the operating system
would want you to do them and embracing the
behavioral characteristics of the operating system. Let’s talk about what I
have on the slide here. Software keyboards can change layout. When you’re talking about keyboard type, it’s important to remember
that there is a difference between a password keyboard
and a normal keyboard. The automatic capitalization of a letter from a normal keyboard is
going to be frustrating to a user who doesn’t know they’ve just automatically
added a capital letter to their password. Conversely, something simple like a number pad helps numeric input. I think getting the keyboard
type right is important. Like I said, support the
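As a rough Swift sketch of getting the keyboard type right (the fields are hypothetical):

```swift
import UIKit

let pinField = UITextField()
pinField.keyboardType = .numberPad   // simple number pad for numeric input

let passwordField = UITextField()
passwordField.isSecureTextEntry = true          // password keyboard
passwordField.autocapitalizationType = .none    // no surprise capital first letter
passwordField.autocorrectionType = .no          // no silent "corrections"
```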
characteristics of the platform. A lot of times on StackOverflow when I’m answering questions about this, I’ll see users trying to
do things with the APIs in ways that are not supported. I will talk about that
specifically for those APIs when I get there. But basically what it comes down to is, if you’re including the information the way the operating system wants you to include the information, it will work well across
devices and device versions. This isn’t as complicated of
a problem as it is on Android because Apple has control of the devices. They sell all the same devices,
they’re all very similar. But we’re still concerned
about different versions of the iOS platform and we still wanna make sure we do things the way iOS users want them done. That way, when things like
the handling of traits change or the handling of labels change, they change universally and change the way an
iOS 7 user wants them versus an iOS 8 user versus
an iOS (audio warps). If you wanna read more about this mapping stuff, it’s available on
www.w3.org/TR/mobile-accessibility-mapping. Lots of great information there. I certainly did not cover it all. I wanna get on to technical details. The iOS accessibility ecosystem. Assistive technologies. VoiceOver is gonna be
your primary concern. That is the most widely
used assistive technology, it’s the primary AT for the platform. It’s also the one that’s most supported, it’s the most up to date,
it’s the most stable. But you also have switch control, zoom, dynamic type sizing, and a bunch of other accessibility options like the ability to change colors and all sorts of cool stuff. Alternative input,
keyboards, braille boards. A common question I get regarding
assistive technology is, do I have to test for all of them? The answer is no. You don’t have to test for all of them because a lot of the
APIs are gonna overlap. For example, when you’re talking about keyboard interaction and VoiceOver, those two things you
probably have to test for. But once you’ve tested
keyboard interaction, you’ve probably tested most of the things that a braille board is gonna do. Once you’ve tested switch control, you’ve probably tested a lot of the things that VoiceOver is gonna do. So a lot of these things overlap. Good coverage is gonna be, you gotta make sure your
dynamic type sizing works, you gotta make sure your
switch control works, you gotta make sure VoiceOver works, and you should check one
alternative input mechanism. If you do that, you’re gonna have really, really good, broad coverage. APIs, let’s cover the APIs. They’re simple in what
the properties tell you, but they’re not as
simple as you might think to get them correct, especially the UIAccessibility Protocol, which is things like accessibility labels, hints, so on and so forth. UIAccessibilityContainerProtocol
is more advanced. It allows you to control what things do and how different groupings
of things are presented. Then there’s UIAccessibilityNotifications, which deals with dynamic
elements on screen and so on and so forth. One of the tools we have available is the iOS Accessibility Inspector. I want to do a short demo of that. When we’re talking about
the Accessibility Inspector, it’s important to note
that it’s heavily linked with what you see coming from VoiceOver. So what we have here is, we have this focus rectangle here, right? And as I’m going and
as I’m focusing things, I’m using this within
the context of a browser, but you can also use this
with an iOS simulator or whatever you have in the background. We have this green
focus rectangle show up. If I were using VoiceOver
in an iOS environment, this green rectangle would represent what I was focusing in VoiceOver. And it gives me the title here, like (mumbles), that’s
the accessibility label. Or the type button,
that’s gonna be the trait. It gives me that information. Really important to note
that this green rectangle that I have focused is the thing
that VoiceOver would focus. Here, VoiceOver in a desktop environment and in an iOS environment is
gonna be the same thing, because you’re gonna see
that green focus rectangle in my slides on some of my examples. All right. Let’s talk about the
UIAccessibility Protocol. There’s a lot of different
properties of the UI, oops. Sorry, I have this focus
rectangle sticking around. I want to get rid of it real quick. Perfect. My Accessibility Inspector
was being obnoxious. All right, the UIAccessibility Protocol. There’s a lot of properties in
the UIAccessibility Protocol. We’re gonna talk about two sets. We’re gonna talk about basic
ones, accessibility labels. If you’re familiar with
accessibility in general, there’s this idea of name, role and value of a thing, name being a simple description, value being things like state
or the number of a thing, and the role being, is it a switch, is it a link, is it a button? Name, role and value. The accessibility label
is normally going to be the name of the thing. It can represent the state or the value. But it should not
contain role information. The accessibility hint is
frequently unnecessary. I would say that any time
the accessibility label contains the name of the thing, the accessibility hint is frequently gonna be additional information. But if the label is the thing that is controlling the state
or the value of the thing, then a hint is a really sensible place to put the name of the thing. I’ll go over that example in
detail in just a little bit. Accessibility traits communicate role information to the user. They can communicate what the thing is. There are a lot of roles that don’t exist within the traits. There are a lot of roles that iOS will attempt
that iOS will attempt to calculate programmatically. And that’s okay. We should embrace the operating system’s wanting to do that. But accessibility traits
just change the way VoiceOver interacts with control and communicate information to the user. There’s two categories. Also, final note, when you’re talking about labels and hints, consider the case and punctuation of your labels and hints. It changes the way VoiceOver
announces your control. Adding exclamations or
periods to the end of things, capital letters at the beginning, helps VoiceOver read things
in a more natural way. With all of that being said, let’s dissect a VoiceOver announcement. Remember that green focus
thing that I was talking about when I brought up the
Accessibility Inspector. That’s what we have here. So VoiceOver, it’s difficult for me to share the screen content. I actually have to hold my phone up here. So what I’ve done is, I’ve said, here’s this green focus thing. This is what VoiceOver focuses. And I’ve added the
announcement to this (mumbles). So when we focus this
control, we get increment, button, there’s a pause,
and then Add 1 to Tomato. So what we have here is, we have increment is the accessibility label, button, the announcement of
button comes from the fact that we’ve added the
UIAccessibility trait button to it, and Add 1 to Tomato is
the accessibility hint. That is that hint information. Notice, remember, in the screen before what I said was, we can use the hint to contain the name information. In this case, the accessibility
label is increment. The accessibility label is the, not value, but it’s
what the thing is doing. The button, it’s a button. We need to tell it what
we’re incrementing, which is the name of the thing. So we have Add 1 to Tomato. We need to associate it with that name. There’s kind of two names. There’s the increment name and then there’s the tomato name. And those two things
need to be associated. Picture this increment, decrement thing within a series of, all right, so we’re incrementing tomatoes. So we have an increment
button for tomatoes. We have an increment button for beans. We have an increment button for potatoes. You need to know what the thing
is that you’re incrementing. And it’s really sensible to store that information in the hint. Let’s talk about all of
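The increment example from the slide can be sketched in Swift; a minimal sketch, with the label and hint text taken from the demo:

```swift
import UIKit

let incrementButton = UIButton(type: .system)
// Name: announced first.
incrementButton.accessibilityLabel = "Increment"
// Role: UIButton already carries this trait; set it explicitly on custom views.
incrementButton.accessibilityTraits = .button
// The hint carries the association: which thing this button increments.
incrementButton.accessibilityHint = "Add 1 to Tomato."
```

With this markup, VoiceOver would announce roughly "Increment, button," then, after a pause, "Add 1 to Tomato."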
the accessibility traits that we have access to. Button, link, image,
those are just gonna cause literally the announcement
of button, link and image on those things. In this case, remember I said
we’re gonna separate traits into two categories. Here we’re talking about things that change the role of the thing. Selected, it’s going to
announce as selected. Unlike these things above,
selected is actually going to announce first. It’s gonna say, “Here,
one of these is gonna be,” like, “a puppy button” or “a cat button.” This is gonna say selected first, it’s gonna say selected cat. Not enabled is going
to announce as dimmed. You use that for things that
you’re temporarily disabling. StaticText is a really interesting one. This is actually going
to hide an announcement. So if you’ve had a UI Text field where somebody would enter text in it but you’ve hacked the UI Text field to just present some static text, you can cause a UI Text field
to announce as a UI label. So instead of saying,
“Hey, this is editable,” whatever, blah, blah, blah, it’ll just announce
exactly like a UI label. It’s actually funny because
you’ll see the StaticText trait applied to things like UI labels. That’s completely useless. It only changes the
announcement of a UI Text field. Header announces as a heading, and that brings us to our accessibility traits. I will go back to this slide, I promise. My slides are out of order by one. Let’s talk about the
behavioral traits real quick. Header becomes a navigation landmark. So what it allows you to do is, it allows you to navigate to things. Instead of navigating by single controls, you can navigate things by headers. Headers are really important. Summary elements on the
traits share information on ViewController load
or new view presentation. Don’t overuse this trait. Adjustable says that the
thing can be changed. It changes the way it interacts with it so that when you’re on
one of those controls, you can swipe up and down and
modify the value of the thing without actually interacting
with the control directly. Think of like a value control
or something like that. Allows direct interaction is
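The adjustable trait can be sketched in Swift; a minimal custom stepper, with illustrative names:

```swift
import UIKit

final class TomatoStepperView: UIView {
    private var count = 0 {
        didSet { accessibilityValue = "\(count)" }
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Tomatoes"
        // Adjustable: VoiceOver users swipe up and down on the element
        // instead of hunting for separate +/- buttons.
        accessibilityTraits = .adjustable
    }

    required init?(coder: NSCoder) { fatalError("not used in this sketch") }

    // VoiceOver calls these on swipe up / swipe down.
    override func accessibilityIncrement() { count += 1 }
    override func accessibilityDecrement() { count = max(0, count - 1) }
}
```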
gonna pass screen gestures directly through to the view. Think like a signature field. You have an area where
the user needs to be able to draw their signature on a view. This is where you would
use something like that. Do not overuse that property, as well. You can really easily take that and completely break the
accessibility of your application by applying it to too global of a view. Updates frequently means don’t announce every change to this thing. If you have like a counter of some kind that’s changing (mumbles) by second, eventually the
announcements are gonna take longer than the updates. If it’s updating more
frequently than the
length of the announcement, you’re gonna end up with announcements that step on each other’s toes and you’re gonna have this long string of announcements that stacks up. Updates frequently is
just gonna announce it every time you have a
pause in the announcement. Like, let’s say you’re counting
to one, two, three, four, it’ll say, like “One,”
and then if I say, “Four,” then it might say (audio skips), depending on how long it
takes to get to the announcement. Don’t let me interrupt
other announcements. Plays sound is gonna
do very similar things, but just in reverse. All right, let’s go back to this dissecting of VoiceOver
announcement thing. Remember the screen back here where I had this increment control that has button, Add 1 Tomato. Look at this little,
oops, I went too far back. Look at this little thing here. Notice we have zero tomatoes right now. So we’ve dimmed, we’ve disabled
this little minus control because we can’t have
less than zero tomatoes. So notice the announcement here. Decrement, dimmed
button, we got the pause, and then we announce the
hint, remove one tomato. It doesn’t make sense to
remove a tomato from zero so we’re now seeing this dimmed. And the only thing that’s changed here is, we’ve added this UIAccessibility
trait, not enabled. That’s what’s causing that
announcement of dimmed. That is how a user, an
iOS user, would expect a disabled control to be announced, is to have it announced as
dimmed, given this trait. All right. The last kind of basic property,
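The dimmed decrement control can be sketched in Swift; names are illustrative, not from the talk:

```swift
import UIKit

func updateDecrementButton(_ button: UIButton, tomatoCount: Int) {
    button.isEnabled = tomatoCount > 0
    // UIButton usually manages this itself when you set isEnabled; a custom
    // control would add the trait explicitly to get the "dimmed" announcement.
    button.accessibilityTraits = tomatoCount > 0
        ? .button
        : [.button, .notEnabled]
}
```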
isAccessibilityElement, allows us to control what is focusable by assistive technology. So if it’s yes, the element is focusable. If it’s no, the element is not focusable. Pretty straightforward. But there are some caveats. We have a view here that has two buttons, and I’m saying that it’s broken. The reason it’s broken, let’s go over it. We have this not-focusable button and this other not-focusable button. What we’ve done here is, so the first button is
an accessibility element. The second button is an
accessibility element. They share the same superView, and that superView is an
accessibility element. What happens is, we can
only focus that superView if we were to go in here and swipe right or hit tab or whatever, we’re not going to be able
to focus these two buttons. VoiceOver is going to skip them and, since those are buttons, it’s going to activate the middle point, which is gonna be nothing. So actually in this particular case, neither of these is going to be actionable. What we want is to get rid of that. We don’t want the
superView to be accessible or what we wanna do is use the UIAccessibilityContainerProtocol to focus these individually. Really in this case
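A rough Swift sketch of the fix (the views are illustrative stand-ins for the slide's broken example):

```swift
import UIKit

let container = UIView()
let firstButton = UIButton(type: .system)
let secondButton = UIButton(type: .system)
container.addSubview(firstButton)
container.addSubview(secondButton)

// If the superview were an accessibility element itself, VoiceOver would
// focus only it and skip the two buttons inside. Let the children be focused.
container.isAccessibilityElement = false
firstButton.isAccessibilityElement = true
secondButton.isAccessibilityElement = true
```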
it’s just easier to say that the superView is not
an accessibility element. UIAccessibility Protocol, other. There are some other properties that I wanna talk about real quick. Accessibility frame, you can
control the focus rectangle. You don’t actually have to
have the focus rectangle be what the focus rectangle is, what the actual view rectangle is. You can make it bigger, which
allows you to do cool things like take a switch and have it wrap over its visible (audio warps). Apple uses that all the time
in their design (audio warps). It’s the idea of wrapping
two controls together into one control. When you couple this with the accessibility activation point, remember that the VoiceOver by default is gonna activate the middle
of your accessibility frame. If you couple this with the
accessibility activation point, you can do really powerful things with grouping active controls
with their visible labels, and you can do them really simply. You just say, “Hey, let
this control encapsulate “this entire thing” and activate the point where the active control is. Really easy idiom for grouping information with active controls. Accessibility navigation style, this is specific to switch
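That grouping idiom can be sketched in Swift; a minimal example of pairing an enlarged accessibility frame with an activation point, with illustrative names:

```swift
import UIKit

final class LabeledSwitchCell: UITableViewCell {
    let toggle = UISwitch()

    override func layoutSubviews() {
        super.layoutSubviews()
        // Wrap the whole row (switch plus its visible label) in one
        // focus rectangle. Accessibility frames are in screen coordinates.
        toggle.accessibilityFrame =
            UIAccessibility.convertToScreenCoordinates(contentView.bounds,
                                                       in: contentView)
        // But make a double-tap land on the switch, not the frame's center.
        let onScreen = UIAccessibility.convertToScreenCoordinates(toggle.bounds,
                                                                  in: toggle)
        toggle.accessibilityActivationPoint = CGPoint(x: onScreen.midX,
                                                      y: onScreen.midY)
    }
}
```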
access or switch control. It allows you to do control groupings. If you’re familiar with switch
access or switch control, what’s gonna happen is,
you’re gonna scan over things and you can control the way those things are grouped together
to make it more efficient. If you have to scan over
every control one at a time, it gets a little frustrating. You can do control groupings and scan over things
in groups much faster. View is modal says, “Don’t focus things
outside of this container. “Don’t focus sibling
containers of this thing.” That’s useful for modal dialogs. You can hide accessibility elements, all of the accessibility
elements within a dialog. If your thing is invisible
for a period of time, you can say, “Hide all of these things.” You can group children. And there’s a lot of
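Those two properties can be sketched in Swift; the views here are hypothetical:

```swift
import UIKit

let dialog = UIView()
let backgroundContent = UIView()

// Keep VoiceOver focus inside the dialog; sibling containers are skipped.
dialog.accessibilityViewIsModal = true
// While the dialog is up, hide everything behind it from assistive tech.
backgroundContent.accessibilityElementsHidden = true
```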
other little properties. I’ve covered the important ones. UIAccessibilityNotifications. This is a separate API from
the UIAccessibility Protocol. It allows us to do cool
things with dynamic elements. What you’re gonna do is, there are two types of
accessibility notifications. There are those that you
send from your application and there are those that you listen to from UI Kit to your application. Basically what you do is, you say UIAccessibilityPostNotification, and then the name of the
notification and an argument. Let’s say I send the screen
changed notification. It’s the most complicated of the ones that you are expected to send. When you send a screen
changed notification with an argument, what’s gonna happen is, it’s gonna play the bleep bloop sound. That’s the thing that tells the user that the screen has changed. If your argument is a string, it’s going to announce the string. If your argument is a view
within the view hierarchy, it will focus that and
then it will announce it. Actually with the screen changed, you can supply a null argument which is going to focus the
first view in the hierarchy that isn’t a system view and then behave just like
providing a view argument. It’ll focus it and then
it will announce it. Layout changed, same as
above, no bleep bloop sound. If only part of your layout has changed, it’s not gonna play the screen
changed bleep bloop sound. And announcement is gonna behave the same, except
you can only supply it
an argument of string. Ultimately, you could use
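The three notifications you send can be sketched in Swift; the views passed as arguments are illustrative:

```swift
import UIKit

let headerView = UIView()
let updatedRow = UIView()

// Whole screen changed: plays the "bleep bloop," then focuses and
// announces the argument (a view in the hierarchy, a string, or nil).
UIAccessibility.post(notification: .screenChanged, argument: headerView)

// Part of the layout changed: same focusing behavior, no sound.
UIAccessibility.post(notification: .layoutChanged, argument: updatedRow)

// Plain announcement: the argument can only be a string.
UIAccessibility.post(notification: .announcement, argument: "3 items added")
```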
UIAccessibility layout changed to accomplish the same
thing as announcement and nothing is different. There’s speculation even
within the Apple developer docs that this will force some type
of rerendering for VoiceOver. I’ve never actually seen an example where calling layout changed
or not calling layout changed changes the way VoiceOver behaves. I would love if somebody
would show that to me if that exists, but I
don’t think that it does. I have tested with the
UIAccessibilityContainerProtocol extensively and have never
seen any of that behavior. So don’t worry if you’re not
posting these notifications. They only control where focus
is and control announcements. They’re not required to call
them as a part of anything. Your VoiceOver will still work even if you’re not using them. Let’s talk about the notifications that you can listen for. There’s a lot of them. Most of them I don’t like. I think when you’re
talking about notifications that you’re listening for,
there’s a lot of edge cases and there’s a lot of
things that I think really, you gotta rethink your design if you’re listening to
those notifications. Some of them are edge cases. But they’re really far edge cases that I don’t think are worth digging into. Element focused, when
a new element receives accessibility focus, you can
know that it’s received focus. That’s valuable for
tracking where focus is within your application,
especially if you have things like where you’re gonna be tempted to post those UIAccessibility
screen changed notifications or layout changed notifications. Knowing where focus is is important because you may want to put focus back onto a similar element
if the screen has changed or if the layout has changed. You wanna know where focus is so if you haven’t removed that content, you can put focus back onto that content. Announcement did finish, if
you posted an announcement of some kind from one
of those notifications that we talked about, you
can know that it’s done. Like I said, there are a
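Listening for those two notifications can be sketched in Swift:

```swift
import UIKit

let center = NotificationCenter.default

// Track where accessibility focus is, so it can be restored after a
// screen or layout change.
center.addObserver(forName: UIAccessibility.elementFocusedNotification,
                   object: nil, queue: .main) { note in
    let focused = note.userInfo?[UIAccessibility.focusedElementUserInfoKey]
    print("VoiceOver focused:", focused ?? "nil")
}

// Know when a posted announcement has finished speaking.
center.addObserver(forName: UIAccessibility.announcementDidFinishNotification,
                   object: nil, queue: .main) { note in
    let text = note.userInfo?[UIAccessibility.announcementStringValueUserInfoKey]
    print("Finished announcing:", text ?? "nil")
}
```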
lot of other notifications you can listen for that are
pretty edge casey in their uses. You can dig into those on your own. UIAccessibilityContainerProtocol. This allows you to control the way your containers are interacted with. The common use case, for example, would be a custom control that doesn’t
have its own views in it. You can create custom
accessibility elements and control the way that
they’re interacted with. To do that, you just
say, “This is how many
“accessibility elements are in my view.
“This is how you get the accessibility element
“at a specific index.
“This is the index of that element
“and these are the
accessibility elements.” You can use this to create custom controls and customize the way VoiceOver interacts with your application. You can also use it, so one of the things that’s cool is, there’s all of these properties. VoiceOver, the way VoiceOver pings all of these properties is, it’s calling these functions constantly. So if you’re curious about the way VoiceOver is interacting
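The container protocol can be sketched in Swift; a custom control with no subviews, exposing virtual elements, with illustrative names and a log line to watch VoiceOver poll the properties:

```swift
import UIKit

final class BarChartView: UIView {
    var bars = ["Monday: 3", "Tuesday: 5"]

    override func accessibilityElementCount() -> Int {
        print("VoiceOver asked for the element count")  // VoiceOver calls this constantly
        return bars.count
    }

    override func accessibilityElement(at index: Int) -> Any? {
        let element = UIAccessibilityElement(accessibilityContainer: self)
        element.accessibilityLabel = bars[index]
        return element
    }

    override func index(ofAccessibilityElement element: Any) -> Int {
        guard let label = (element as? UIAccessibilityElement)?.accessibilityLabel,
              let i = bars.firstIndex(of: label) else { return NSNotFound }
        return i
    }
}
```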
with your application or the way VoiceOver is
interacting with your controls, you can apply these properties to them, let them be default things, like have element count just be the number of children
in a view, return that, but then you can log in from
information from that call to see how all of these
things are working. Like I said, back in my disclaimer about the UI layout changed
notifications and stuff, you could really glean
a lot of information about the way the accessibility
protocols and stuff work by logging things from these properties. Now I got about 10 minutes left. What I wanna do is, poke around a few
applications in VoiceOver and show you what some
of this stuff is doing. I have my AirPlay set up. Let’s look at the
accessibility of a few apps. Where did my AirPlay server go? All right. All right. Calm down here, there we go. This is AirPlay server. This is, you are looking at me interacting with, this is
actually the app store. What I’m doing is, I have VoiceOver and I’m swiping right. When it comes to an announcement, I will help announce it for you. Let’s focus one of these things. Notice here, one of the things that I said about VoiceOver is, you
could collect things. You can collect views. Notice how we’ve wrapped this entire view in one accessible object. This is good. We’ve associated this data together. There’s a lot of different
data points in here that need to be collected together. Hold on, let me switch. We have all of these views
that need to be collected together. For example, this rating star, right? This rating star, that this rating belongs
to this particular review, is important information to collect. Because if we have “4 hours ago” and “Friday,” these two things, it’s not apparent whether this is associated
with this one or this one. There’s some ordering issues. If you were to focus these individually, there’s ordering issues. Is this heading here belong up here, or does this heading belong down here? So what they’ve done is,
they’ve removed that concern. They’ve grouped all of
this information together, which is great. All of this information
is grouped together, we know where it belongs, and there’s only one
actionable thing here. This grouping is really, really good. Now, that being said, we can, let’s see if we can find, I found the example. What happens is, if you have
one of these rectangles, and I can’t find the example of it. I wanna get to the more reviews. Or maybe let’s go look at
a different application. Let’s try, let’s try. Here we go, this is interesting. Here we have the same grouping going on. But we have multiple actionable elements,
notice they’ve separated this. Apple’s doing another smart thing here. We have two actual controls. So there’s data that we
wanna associate in here. We have the cost of the thing and we have the name of the
thing over here to the left. But this thing over here
to the left is actionable. When I activate it, it’s
going to do a separate thing than the price is going to do. So what happens is, we have to focus those two things separately. We can’t wrap these two
things into one container because there are two
actionable things here. So what we have to do is, we have to create that association, this association where I’m paying $7.99. I need to know that I’m paying
$7.99 for NBA 2K18 games, not $7.99 for Plague Incorporated. So what we have to do is, we have to associate that differently because we have an
actionable thing to the left and an actionable thing to the right. Those have to be focusable individually. And in this case what they’ve done is, I can’t hold the announcement up to you, but basically there’s a hint
that reads out afterwards that says, “Hey, this is the thing that.” The proper way to do that would be having the hint that
reads out afterwards say, “$7.99 paid for NBA 2K18 games.” That’s the way that you
should mark this up, creating that association
through the hint. Then what happens is, developers, power users of your application, because there’s two different
types of data that we have. We have that data that we want right away. We don’t wanna wait for the fact that this is $7.99. That’s the new information here. By placing that associating data, by saying $7.99 for NBA 2K18 games, users that are familiar
with your application don’t have to wait for that pause. They’ll get used to the fact that this is a list of costs with games. They’ll know that,
they’ll understand that. And they will learn to know
that the cost is to the right. And they’ll know, “All right, all right, “I’m going down this
list and swipe it quick. “All this is $7.99. “I don’t need to wait
for the NBA 2K18 hint. “I know that the cost is to the right “and that’s where it belongs, “because I’ve used this
application before.” So you can program. By putting that information in the hint, you’ve separated power users and standard users of VoiceOver. Power users are gonna know what it’s for, that it’s for the things to the left. That is all of the information I had. Oh, my AirServer just died. Perfect timing, it’s like
I did that on purpose. I would like to know if you
guys have any questions. But thank you for listening. Don’t forget, I am on StackOverflow. Like I say, if you tag
things with VoiceOver, TalkBack, Android, iOS, accessibility, I guarantee I will read those questions. Whether or not I have time to
answer them is questionable. Also, Native Mobile Deque Blog, we’re starting these series of blog videos where I do, much like I did
today here on the webinar, but more technical, really
specific chunks of information in about five-minute snippets. And then in the blog post underneath, we’ll emphasize the important parts, adding timestamps that connect to spots in those videos, so on and so forth. But how to implement
really specific things, for example how to
implement those groupings that you saw today in VoiceOver, how to implement that in a
really easy accessible way, how to implement the
idea of activation points so that you can group active controls with some informative
content grouping together, so on and so forth. All of that information is
stuff that I have coming up, showing you how to do that
is planned in this blog. But that is all I had. I am ready for questions. – [Laura] Great. Thanks, Chris. Got a question here from S.K. asking, “Where do you find the
Accessibility Inspector “that you were demonstrating?” – [Chris] Ah, perfect. Yeah, I did rush through
that part a little bit. So let me show you. Let me bring this down here. Here is Xcode. This is my Xcode environment. You go up here, Xcode,
Open Developer Tool, Accessibility Inspector, and that opens up this window here. You can use this. When you’re connected over
here, you can use this. You can analyze Finder, you can connect it to any
process you have running, including the simulator. Actually, if you have an iOS device connected to your computer, you can connect it to that device as well. Now, an important thing to point out about the Accessibility Inspector is, when you’re looking at an iOS application, it’s sharing the accessibility properties with you as modified, kind of fake, desktop versions of them. So you’re not gonna get
all of the properties that you’d get from iOS. You’re gonna get desktop
formatted versions of those properties. But it’s still really, really useful. Just don’t take all of
its information to heart. Sometimes you’re gonna get slightly misleading information from this, in an iOS environment. So when you’re looking
at the iOS simulator, that information, it’ll
be misleading sometimes. For example, traits. Some of the traits information doesn’t come over 100%. – [Laura] Got it, okay. I have another one here. “Explain emulation and
why VoiceOver on iPhone “is not captioned.” – [Chris] Why VoiceOver on iPhone is not captioned, and emulation. I guess I don’t really understand what, I feel like if I were to
answer that question right now, I might be interpreting
the question wrong. For example, the reason I’m
misunderstanding the question: when you say VoiceOver and emulation, there is no VoiceOver on the simulator or the emulator. There’s actually a reasonable distinction between what simulation versus emulation is even doing. So when you’re talking about
captioning of something, as in supplying text
strings for the words, that’s just an option
that they decided to omit. When VoiceOver delivers strings of text to the text to speech engine,
it delivers those as strings. There’s no reason that that information can’t be supplied somewhere. Although again, how that
links to the simulator or emulator voices it versus
an actually iOS device, I’m not clear on that question
so you’ll have to clarify. Maybe that’s a good thing
to get an email about. If you send emails to Laura, she can pass them through to me. I just need more details there. – [Laura] Great, so another one here. “Is there API functionality “for the use of color contrast ratio?” – [Chris] Is there API functionality for the use of color contrast ratios? There is no API that will allow you to easily calculate the
color contrast of two things. I do have that built into our native mobile accessibility testing stuff. That’s Attest for iOS. But as for, like, APIs
that allow you to do that, basically you have to capture
the color of the thing and calculate that yourself. There’s a lot of weird things
to do with alpha blending and default alpha states of backgrounds that get all the way
back to whether or not your main view is opaque, and a lot of weird considerations there. I definitely don’t recommend doing that by yourself. You should definitely check out Attest for iOS to do that. But there are APIs that
allow you to change things or respond to changes in color. If we go here, UIAccessibility, and that’s actually some
of those notifications that I did not look at. Oh, no, that’s not the (mumbles). Trying to find UIAccessibility
notification names. I hate Apple’s documentation on this. It’s very difficult to navigate. Notification Dictionary, here we go. Oh, these aren’t even
all of the notifications. (mutters to himself) No, it’s hard to find. If you go to the list of
UIAccessibility notifications, you will find things in there like the ability to
respond to darker colors and the ability to respond
to inverted colors. Those are some of those notifications that I didn’t listen to. What that allows you to do is, if someone is asking for darker colors, you can make those darker color changes more dramatic than the
default system properties. Like, let’s say that you
have a pair of colors that don’t invert well, you can manually supply the
inverted color for those. So those allow you to interact with things that control color contrast. But know there is no property
that allows you to say, “Hey, does this have good contrast,” or even, “What is the color
contrast of my thing?” In fact, there’s a lot of different ways of calculating color contrast. I don’t even think the world agrees on what the color contrast
calculation should be. When I say color contrast calculation, I refer to the luminance calculation as supplied by the W3C and WCAG 2.0. And no, there is no API to do that. – [Laura] We have another
question here from Joanna, asking, “When you’re opening
up the Accessibility Inspector, “there seemed to be
another option called Demo. “What is that? “Is that part of Xcode? “Is that used to demo
accessibility support at all?” – [Chris] Option called Demo. I am not sure, within the realm of the
Accessibility Inspector, I’m not sure what it is that that question is referring to. If there’s a slide or a specific spot that the word Demo pops up,
would be happy to answer. But I’m not sure what that
question is asking exactly. – [Laura] Okay, move on. “In the app store view,
would the app name, “photo and rating
VoiceOver at the same time, “since they’re grouped together?” – [Chris] Yes, that’s a good question. I have my headphones on. Let me take my headphones off real quick. Actually, just partially. All right. I am bringing up my (vocalizes melody). I have my AirPlay server
with my device on. I believe the question
is, so the app store, so we have this control
grouping here, right? So the question is, does it
announce just the picture, does it announce everything? What you wanna do when you do this, I’m gonna get what the correct answer is. I actually can’t hear VoiceOver. I have it turned all the way down so that it doesn’t
affect you guys too much. Basically what it should announce is, basically what we wanna do is, we have informative information. We have this rectangle that is saying, “There is informative information
within this rectangle,” and we’re also promising that there’s only one active element. If there’s more than one active element being focused at a time, that’s always gonna be an issue. But so we have informative information and one active control. And we want to announce
all of that information. I believe Apple gets this correct. But basically the information
that we have here is, one of some odd number. Where we focus the first of something, Minecraft, this is decorative, actually. Unless it includes some text, the image that that represents is, oh no, I went too far. The image that this represents in this case is decorative. So you have Minecraft, we have Games, we have four of five stars, 10,089. I just turned it up and I’m listening to it in the background, and they do in fact get that right. But yes, you want to,
anything that that VoiceOver focus rectangle encompasses,
you wanna announce. The reason I say that, I actually say that for a really specific reason. When you’re presenting
information to VoiceOver, you have to keep in mind two users. You have a blind user
who’s gonna be confused by things like the grouping
thing that I talked about, how we have this 6.99 and
the ordering issue here. The 6.99 applied to the game before it or to the game after it. But then you also have
the partially sighted user who is using VoiceOver
just for the readout. They don’t need the navigation help, they don’t need the
help activating things, they don’t need the help
finding these things on screen, but they need help understanding that maybe they can’t read or maybe they have difficulty reading because their vision is blurred. But they can still see. Like, they can see this label here and they can see this label here. So if you have a focus rectangle encompassing all this information but you omit this five star
thing and just say Minecraft, you’re gonna confuse a
partially sighted user. So what can happen is, let’s say that games was
individually focusable, but we still have this
focusable thing here. You’re gonna have these
weird nested touch targets, and someone who is just poking around and exploring with touch to explore is gonna have a really bad user experience doing something like touch to explore and then having it jump
from Minecraft, Games, Minecraft, Games, just because
their finger is touching just a slightly different thing. And they can actually see that happening. That’s gonna be a really
bad user experience for a partially sighted VoiceOver user. So yes, we want all of the information that a VoiceOver rectangle encompasses, we want to share together because we’re not just
concerned about blind users, we’re also concerned about
partially sighted users. And all sorts of other good reasons for grouping all of that
information together. – [Laura] We have another
one here from S.K., clarifying when he was
referring to testing. He says, “It’s for those who
are deaf testing on iPhone. “There are captions on MacBook VoiceOver “but not on the iPhone, “and for TalkBack on
Android, it is captioned. “So, trying to find a
solution for that problem.” – [Chris] Yes, yup, I understand. That is a good question. Actually, one of the things
I said at the beginning was that I find demoing on
VoiceOver frustrating, because I would use that caption
when I’m doing demos so that you can see
what’s being displayed. So yes, I share that frustration. It doesn’t output on
VoiceOver, but it will output to alternative devices like braille displays. I haven’t actually thought of it in terms of this before, but I’m fairly certain that if you can have a braille display device that will grab onto that information, you should also hypothetically be able to have some other type of technology that can just present and display that information textually. Why that’s not built
into VoiceOver directly, that is a question for
Apple and not for me. – [Laura] Thanks, Chris. Another one here, “Imagine
you have a modal view. “If you dismiss it, the VoiceOver focus “always goes behind it in
the same frame it has before. “How can you move the focus at
the right position you want, “for example at the
beginning of the screen?” – [Chris] Yup. If we go back to my presentation, those notifications that you can send, those notifications that you can send from your app to the system, the screen changed,
oops, I started clicking, expecting things to highlight. The screen changed notification and the layout changed
notification are what you want. The argument here is gonna be the view that you wanna move focus to, so you wanna say
UIAccessibilityPostNotification with UIAccessibilityLayoutChangedNotification. So if what you have is a
modal that’s disappearing, I would say that the screen
hasn’t updated, right? The content that was behind that modal is still roughly the same. So you probably wanna post the
layout changed notification. You wanna say
UIAccessibilityPostNotification with UIAccessibilityLayoutChangedNotification, and the view that you wanna have focused. If that view is the first
view, just find that view, supply it, push off this notification. There are race conditions you can get. So if that behavior doesn’t work, make sure that you’re doing
that at the right time. You don’t wanna post that notification as your modal is disappearing. You wanna make sure you
post that notification after your modal has completely gone. And yes, that may mean that focus goes somewhere arbitrary first and that you hear this
partial announcement. That’s a weakness of the APIs. But what I would say as
subnote to that is that, in general, unless you have
some type of custom modal, the behavior of a modal
and where focus goes after a modal is dismissed is something that I would normally
recommend just leaving alone. There may be something
custom that you’re doing or some other circumstance
that’s changing this assertion. But if you’re doing modals
the way iOS has you do modals, you should also let iOS determine where focus goes afterwards. Just those default behaviors, if you’re doing something
contrary to the way everyone else in the world is doing it, that is going to tend
to be less accessible than perhaps the thing that you’re reading from some set of standards
or something else. If you’re doing custom
things, absolutely control it and push it somewhere
and have that link up with the default
VoiceOver iOS expectation. – [Laura] I have another one here. “How can you get VoiceOver
to read accessibility traits “like link throughout continuous reading?” – [Chris] Like link
throughout continuous reading. I don’t know what the extension of, I assume when you say continuous reading what you’re meaning is the, like VoiceOver is just reading
all of the screen content. My answer to that would be, you absolutely should
not get it to do that. The reason I say that is because to do so would require you to include
that role information within the accessibility label. So the way you would fix that, the way you would do that,
is to include the word link within the accessibility label (mumbles) add that role information to that thing. And you absolutely should not, Apple has in, for whatever reason, decided that the link information
should be omitted from the continuous anything. Basically what they’re saying is, all of the information
on the continuous read is this, quote, informative information. The role of a thing, the hint
of a thing doesn’t matter. We’re not reading out
all of the properties. We’re just gonna read, I think, and when you think of that
in terms of traits and hints, where you say, let’s go back to this dissection of an accessibility announcement page. What is not happening
on a continuous read is, we’re not focusing all of them and saying, “Decrement,
dimmed,” long pause, “remove one tomato.” That’s not what a continuous read is. A continuous read is just saying, “This is the information
presented on this page.” So Apple has said, “Hey,
let’s just go there “and say ‘decrement,
increment, zero tomatoes, “‘number of tomatoes, number of beans, “‘decrement, increment.'” The alternative would
be, “Number of tomatoes, “decrement dimmed button,” pause, wait, “remove one tomato, increment
dimmed,” pause, wait, “add one tomato.” That’s not what the
continuous read is doing. The continuous read is saying,
“Here’s the information. “Go in and scan these things.” So my answer is, how you would do it is, you would include the link information in the accessibility label. You absolutely shouldn’t do that and should drop that requirement. – [Laura] Another one here. “iOS supports accessibility roles. “But this is not supported in Android OS. “What is the best way to
handle roles in Android OS?” – [Chris] (chuckles)
This is an iOS webinar. You should watch my Android webinar video. But basically, I think I can answer that in a cool Android iOS-y way. When you’re looking at
the way iOS does things in terms of accessibility is, they kind of steer you in these specific iOS
ways of doing things. When you’re looking at the difference between iOS and Android, iOS keeps you in this confined, safe environment and they give you a lot of controls over these simple properties. What Android does, the
Android philosophy is, “Let’s give you a few
really flexible properties “and let you do whatever
you want with them.” So in iOS what you have is, you
have buttons, links, labels, and then there’s some secondary role information that plays into the idea of earcons, you know, like where it says, “Double tap to press.” Well, anything that says “Double tap to press,” it’s sharing the role of the thing, and that role is that it’s actionable. In Android what happens is, they try to do a lot of stuff for you. They try to calculate
the role of the thing so the role is contained in the fact that you’ve used a button and that you used the
android.widget.Button or android.widget.whatever. The role is the actual semantic
class type of the thing. In iOS what happens is, they say, “We will supply defaults for that. “So, like in the interface builder, “if you plop a button down, “that button is going to automatically “have the trait of a
button supplied to it.” But that doesn’t keep
you from removing that and using a button as a
purely informative thing. You can have buttons that
aren’t actually active, although if you’re disabling it temporarily, it should be dimmed. But if you permanently remove it, if you’re using a button
just to present information, there’s nothing keeping you from saying, “Remove the accessibility trait of button. “Remove all of the actionable things “that make a button actionable,” and using it to present information. But that’s just kind of
the fundamental difference between iOS and Androids,
is iOS is gonna keep you in this bubble and Android is gonna say, “Hey, let’s do as much for
you automatically as we can. “But if you want to go out and explore “these super flexible APIs
and just overwrite everything “with the content description, you can.” You shouldn’t, but you can. – [Laura] “Have the simulator VoiceOver,” sorry, we’re gonna skip to another one. “If you don’t announce a link, “how does the user know
that a link is there?” – [Chris] They don’t. A link is a very
specifically used property. Basically what a link means to me, so when you’re talking about the web, web developers and accessibility
experts will argue forever over what the definition of a link is. When you’re talking about iOS, the reason the trait iOS
link exists, in my opinion, is because phones, when we have a phone, when we have an application
and you interact with a button, it’s gonna keep you within the context of your application. So when you’re scrolling
around and you tap a button, your application context doesn’t change. Your view doesn’t change. Maybe you go to a different
tab within your application, but you’re still within the same package. A link is gonna be the opposite. When you activate a link, you can have links that take you to things that aren’t webpages. You can have, like for example, it’s becoming really
popular to have autolinks on, like, phone numbers. So to me, what a link
means within the context of a native application
within a mobile environment, is anything that you click
that’s gonna take you outside of your application
package’s context and to something else, because that is a very
important change of context to communicate to the user. So Apple did something really
smart with that trait of link by allowing us to control it, because by that definition,
there are a lot of things in iOS that are links
that don’t have that, quote, link appearance, that
don’t have the underlined text, that don’t have the http.www.whatever.com. There are a lot of controls within iOS that are gonna do that
behavior that I described, where they take you out of
that application context. And that, to me, is what a link is in iOS, not your standard web
definition of a link. So for your standard web
definition of a link, you gotta add that property. You gotta make things
separately actionable and apply the trait link to them. That’s the only real way to do it within a native iOS environment,
is to get that link. And, I mean, of course you
could say, accessibility label, add link to that. We’ve already discussed
why adding role information to things is a bad idea, unless it’s actually in
the actual visible text. But the only real way to do it properly is to make sure that,
when that link is focused, when that thing that is actionable, that is gonna take you out
of the application’s context, that thing needs to have
the trait of link on it. – [Laura] Great. We’re reaching the end of our time here. We’ll make sure to get to
the rest of the questions, hopefully through email. You can email me and I’ll
definitely loop in Chris, to make sure we get all
those questions answered. Thank you all for joining
the webinar today. We’ll be sending out the slides and the recorded version of this webinar to everyone within the next 24 hours. I’ll hand it over to Chris to wrap up. – [Chris] I’m good. Thanks, everybody, for listening. Like I said, feel free
to follow my blog videos and ask questions on StackOverflow. Hopefully you all got
something useful from this. But thanks for listening.
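The grouping and hint-association patterns walked through in the demo can be sketched as plain Swift string logic. This is a minimal sketch, not Apple API: the function names here are illustrative, and the actual UIKit assignment points are noted in comments so the string-building part stays runnable anywhere.

```swift
import Foundation

// Pattern 1, grouping: a review cell with only one actionable element
// wraps all of its data points into a single accessible object.
// On iOS you would set:
//   cell.isAccessibilityElement = true
//   cell.accessibilityLabel = groupedReviewLabel(...)
func groupedReviewLabel(rating: Int, date: String, title: String, body: String) -> String {
    return "\(rating) of 5 stars, \(date), \(title), \(body)"
}

// Pattern 2, association through the hint: the price control is separately
// actionable, so it stays its own focusable element. Its label carries the
// new information (the price) first; its hint carries the association, so
// power users can swipe on without waiting for the pause before the hint.
// On iOS you would set:
//   priceButton.accessibilityLabel = result.label
//   priceButton.accessibilityHint = result.hint
func priceAnnouncement(price: String, title: String) -> (label: String, hint: String) {
    return (label: price, hint: "\(price) for \(title)")
}

let review = groupedReviewLabel(rating: 4, date: "Friday",
                                title: "Great game", body: "Worth every penny")
print(review)

let price = priceAnnouncement(price: "$7.99", title: "NBA 2K18")
print("\(price.label), hint: \(price.hint)")
```

For the modal-focus question, the UIKit call itself is one line: UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, viewToFocus), posted only after the modal has completely gone; in later Swift versions the same call is written UIAccessibility.post(notification: .layoutChanged, argument: viewToFocus).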