Using the web with limited vision is awful. I just didn’t realize how awful.

A year or two ago I started reading some of Ethan Marcotte’s calls for better web accessibility. Taking steps to make websites behave more predictably, play nice with screen readers, and follow some code standards all seemed like easy enough things to do. I thought they’d be especially easy if we thought about them before developing a new website. Like wiring a house before putting up the drywall.

As one example, I knew that screen readers just read down the screen. Unlike sighted users, who quickly learn to ignore the right column of ads or the navigation menu on each page, screen readers can’t discern what’s junk and what’s not. So they repeat everything on every page. Adding a quick “skip link” lets the screen reader jump past the site’s navigation menu every time a page loads.
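
To make that concrete, here’s a rough sketch of what a skip link looks like in the markup. The id and link text are just placeholders; the point is that it’s the first focusable thing on the page:

  <body>
    <!-- First focusable element: lets screen reader and keyboard users
         jump straight past the navigation that repeats on every page -->
    <a href="#main-content">Skip to main content</a>

    <nav>
      <!-- site navigation that would otherwise be narrated on every page load -->
    </nav>

    <main id="main-content">
      <!-- the actual content of the page -->
    </main>
  </body>

In practice you’d usually hide that link with CSS until it receives keyboard focus, but the mechanics really are that small.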

Easy peasy! Throw in some ALT tags for images to describe what they are and we’re all set.
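
And the markup side of alt text really is one attribute per image. A quick sketch, with made-up filenames and descriptions:

  <!-- A meaningful image gets a short description for the screen reader -->
  <img src="board-meeting.jpg" alt="Five board members around a conference table">

  <!-- A purely decorative image gets an empty alt so screen readers skip it entirely -->
  <img src="divider-flourish.png" alt="">

The empty alt matters: without it, some screen readers fall back to reading the filename out loud.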

Problem is, none of this works like I expected.

On Friday I went to a client’s home to help them set up anything I could think of to supplement their vision. After a sudden surgery for macular degeneration, a temporary side effect left them unable to see well.

They had three devices: A Windows 7 desktop PC, an iPhone 8 with a home button, and a Windows 10 HP laptop.

iPhone Accessibility with VoiceOver and Zoom

We started with the iPhone. I went into the accessibility settings thinking that if we just made the text larger, things would be easier to read.

For them, text size wasn’t the issue. The issue with their vision was a large dark area that blocks light. Imagine trying to read a piece of paper late at night. By shining a flashlight at an angle across the screen, they could make out the words and other elements. But it wasn’t sustainable. Using the device with one hand while holding a flashlight in the other was difficult for an octogenarian.

For kicks, we tried using the phone with Zoom on. That made the phone more difficult to use because it takes three fingers to move around the screen. Instead of just enlarging items on the screen, it turned the device into the world’s lousiest magnifying glass. You scroll a zoomed “window” around in all directions with three fingers and toggle it on and off with three-finger taps, so you can’t read the small screen with any ease because your fingers are constantly in the way.

You should try it yourself. Go to Settings > Accessibility > Zoom and toggle it on.

Next I tried VoiceOver. VoiceOver reads what’s on the screen so you can listen to it instead. For all the effort Apple puts into promoting its accessibility work, I had high hopes and expectations for this.

Turning VoiceOver on immediately displays a warning that enabling it will change the paradigm of how you use your device. It’s not kidding. It’s like using an entirely different operating system for the first time.

  • VoiceOver uses the pre-Siri robotic voice you remember from the early 2000s on the Mac. Harsh, clipped, and fast. iOS 14 has newer voices available for download that might sound better, but on an old iPhone 8 with limited storage, that was a big ask.
  • With VoiceOver on, each tap selects an object, like a link or an app icon. VoiceOver then reads what you tapped, and if you don’t quickly do something else it keeps adding “Double tap to activate. Or use Force Touch for more options.” This phrase became insufferable. Imagine if, every time you clicked and dragged something with your mouse, your computer said “Let go of the mouse to drop. Or press CMD for more options.” You’d probably unplug your speakers.

Couple this with mobility and dexterity problems

If you’ve ever watched an older person or a beginner use a touch screen, you’ve probably noticed they don’t hover their finger over the glass. If you pick up your phone right now and launch Messages, notice how closely you keep your thumb or index finger over the keyboard and glass. You’re likely within centimeters of the surface, ready to tap and type. Older people don’t do that, because they either don’t think to, are afraid to touch it, or have shaky hands that trigger superfluous actions.

When my client tried to open an app, they held the phone in their left hand and then struck with their index finger from 4-5 inches away. I’ve noticed this in a lot of beginner touch-screen users. It’s an almost violent interaction, as if the glass were made of lava. That large gap between finger and screen is like dropping a missile from low Earth orbit: the chances of missing are high, and it takes longer to reach the surface.

With VoiceOver enabled, the interaction to open Messages went like this. Items in brackets are the robotic device voice. Keep in mind the first icon on the home screen is FaceTime:

  1. Push home button to enable screen.
  2. [FaceTime. Double-tap to activate. Or use Force Touch for more options.]
  3. “I don’t want FaceTime.”
  4. “Select Messages with one tap,” I say.
  5. Taps Messages after looking for it.
  6. [Messages. Double-tap to activate. Or use Force Touch for more options.]
  7. Taps Messages again.
  8. [Messages. Double-tap to activate. Or use Force Touch for more options.]
  9. “You need to double-tap that now to activate,” I say.
  10. Taps once and then lifts finger for a second tap. [Messages. Double-tap to activate. Or use Force Touch for more options.]
  11. Second tap lands. [Messages. Double-tap to activate. Or use Force Touch for more options.]

This went on for a frustrating amount of time. The time between their first tap and their second was long enough that the device registered two single taps instead of one double-tap.

Adjusting the tap timing, again in Settings, helped a little. But their poor vision and unsteady hand meant taps often landed in the wrong place. A double-tap aimed at Messages might put the first tap on Messages and a mistaken second tap on Phone.

Even when I used the iPhone with VoiceOver enabled, it proved frustrating and annoying. Going into Messages meant the reader kept saying things that weren’t helpful most of the time. It sounded like this:

  1. Tap Messages [Messages. Double-tap to act—]
  2. Double-tap Messages [Messages.]
  3. [Edit]
  4. [Messages]
  5. [New Message]
  6. [Message from: John Doe…]

You’ve probably never noticed, but when you go into Messages you see the “Edit” button in the top left, the heading “Messages” in the middle, and the “New Message” icon in the top right.

VoiceOver reads these each time you go into the app because that’s the order they appear in. It’s worth noting, too, that if you have unread messages, VoiceOver will say [Messages. 1 new unread message.] when you tap the icon on the home screen, and again once you’re in the app, before it starts reading anything else.

This behavior was the same on their phone, which contained 1,257 unread items. They, like me, never delete their voicemails; we both just wait for the text message transcription to pop in.

The transcriptions were a nightmare. VoiceOver not only tried to read the transcription, it did so while playing the voicemail, a significant bug that rendered the feature completely useless. Listening to a robotic voice read a broken transcript, where every third or fourth word is wrong, while the actual voicemail audio played underneath made my client shake their head in frustration. “I can’t use this.”

“No, this isn’t useful or helpful at all.”

We turned it off. Literally all they wanted their phone to do was read voicemails and text messages. Text messages, which, by the way, were overflowing with political campaign spam.

Windows 7 and 10 Narrator equally frustrating

On their desktop PC running Windows 7, my client wanted to get similar help with email. Specifically in AOL, which they’ve used since the Clinton administration.

This proved equally frustrating, though surprisingly better in some ways. Narrator performed the same on Windows 7 as it did on Windows 10; it doesn’t seem like there were any improvements there. But it at least sounded more human out of the box.

And like VoiceOver on iOS, Narrator read down the list of what it saw on the screen. But it wouldn’t read email. It could read the “chrome” around the window, things like the File menu, Edit, and so on, but not the actual text of the email.

For that, we tried a service that promised to read email aloud, but it clearly wasn’t designed for anyone who gets actual email.

The first email was junk from a car dealer. My client, like me, does not own a car and does not want a car. But there we were, staring at an ad for a new Audi while they beamed a desk lamp onto the screen to try to read it.

As difficult as tapping was on their phone, clicking with a mouse was nearly impossible. They just didn’t realize it. I sat and watched as they clicked helplessly at what they expected to be a close “X” in the top right. Instead, their clicks registered on all manner of banner ads, spam, links, and useless menu items.

Apple nerds have long complained about Apple shoving menu items into “drawers”, or otherwise requiring you to click a hamburger menu icon to show them. Microsoft goes the other direction and shoves every icon imaginable into ribbons and menus. I now believe Microsoft’s method puts too many accidental clicks in reach. For users with dexterity or visual impairment, all those icons and menus are landmines ready to blow.

Within seconds my client had opened several tabs and windows. They were on the way to unsubscribing, buying a new Audi, sending a new message, and organizing Contacts when they really just wanted to Exit the window. With each frustrating click, more random stuff just bounced around the screen.

On the laptop, it took me ten minutes to figure out why a checkbox was enabled that routed the microphone straight to the speakers. All that setting did was create a feedback loop of noise. There’s no reason for that option to exist, but it was on, and it made the HP laptop’s microphone and speakers useless.

Windows 10 does have a Dictation app, but it requires you to trigger it with Windows + H, and it works like Siri on your phone: sporadically and with clipped commands. Knowing when it’s speaking, when you should speak, when to let go of the keys, and how to listen for its feedback was all challenging. And, like your phone, if you don’t know what you’re going to say the instant it begins listening, you get superfluous input or nothing at all.

Better tech might help, but the failures were not theirs

Nerds might say an ad blocker, Alt+F4 to close windows, a bigger screen or a better device, or switching to Gmail or Outlook or some other software would help. It probably would. And for millions of people with prolonged visual, mobility, dexterity, or hearing impairments, that’s probably what they do. On-device screen readers can at least work in their platforms’ native apps, Edge and Safari respectively, which is something, but they’re also an impossibly limiting and frustrating experience because of what we throw at them to read.

My client wants to work on their book, read text messages, and send emails. And they want to do so, hopefully with just a little help for a couple of months until their vision returns. My client’s not trying to build a house with custom power tools. They just want to send emails.

Waking up one morning and telling yourself you need $300 in Dragon Dictation software (which only works with Outlook and Gmail), switching email to somewhere else, learning a whole new paradigm, buying a new microphone, or just getting better at holding the device are all wrong answers.

In my research, I learned Medicare allows people to request a Mobility Coach who can help train patients on this stuff, at the request of a physician. Thing is, you have to get the referral, go through an intake process, and then wait 8-10 weeks for someone to get to you. At 84, time’s ticking, and 8-10 weeks is a long time.

This is not their fault, and it’s not the fault of any user. This tech is lousy and hostile. It’s unfriendly and annoying. Web, app, and software developers have made equally unfriendly products, everything from web pages to emails to software. And Medicare is surely too overloaded to quickly send a therapist to every house that needs one on demand.

I’m sure things are getting better, all things considered. Certainly the software on an iPhone today is better than it was ten years ago. But a lot of things are working against all of us.

For my part, I think about the emails we design for clients and the webpages we make. I think about the constraints I have in making those.

  • There isn’t a week that goes by without someone wanting me to dump a scanned PDF document from an office copier onto the web. For screen readers, this is a non-starter.
  • Almost no day goes by without someone asking me to put together some infographic or cute thing to stick somewhere. Hand-drawn fonts and other unique typographic features might look attractive, but they’re a waking nightmare for many people.
  • People constantly ask to “Put these photos in our next email”. Those dozen photos not only bloat email bandwidth requirements, but they’re also seldom worth much even to fully-sighted people.
  • Shoving ads “up top” or into bigger spaces so people “will be sure to see them” seems easy enough to ignore with working eyes. But for my client, this not only made the web tougher to use, it also wasted a lot of time. The myriad spam texts and emails would have easily taken an hour to work through by listening. You or I would just swipe, swipe, swipe, or tap “Edit” and select many items at once. That option is largely disabled in VoiceOver.
  • Clients always want some visual distinction. A unique design, of the kind made popular by the throngs of visually attractive but largely empty templates from theme stores, is what people pay for. It’s not what you or anyone else actually wants, though. All those webpages with cute counters showing how many cups of coffee your team consumes a day add nothing. Giant banners and sliders add nothing.
  • I’m constantly fighting a battle where clients want to add links to external websites, PDFs, and images directly in their menus. This is bad even for sighted users.

All of these things are battles designers and developers are fighting every day. And sometimes we, myself included, just don’t have the capacity to fight all of them. A teammate will ask me, “Are we really doing this?” and all I can utter is “This just isn’t a hill I’m willing to die on.”

This is frustrating, because why on earth am I fighting at all? And then I realize what it is we’re saying:

  • Remove distractions and “tricks”, like banner ads at the top of a page
  • Invest in web and language accessibility
  • Recreate material to be accessible when it otherwise isn’t
  • YouTube videos need a transcript for those who can’t hear
  • Websites need compelling and useful audits of all images, links, and behavior

There’s no easy way to make a website better or more accessible. It costs time and money, and for a lot of clients it doesn’t pay the bills. It’s like adding an elevator to your house. Most clients would rather we invent a “make money” button and just press that time and time again.

But there is overlap. We took a look at a website of ours and measured how many clicks it took to buy something versus our competitors. We took half as many clicks. From an accessibility standpoint, that’s good for all users, because everyone gets done faster.

But we do it in fewer clicks by not asking for things like phone numbers and “how did you hear about us?” at checkout. That’s a battle we have to fight but really shouldn’t have to.

There are no plugins for this that you can just install and walk away from. There is no service. It’s like going to the gym: it’s hard work that requires time, deliberate thought, and dedication.

Developers need to start bundling the costs of this kind of work into their services. And if clients don’t want to pay, then I say having nothing is better. We have to stop pumping garbage onto the web.

Developers also need to learn what it takes to make sites accessible and test for it just as rigorously as we test mobile layouts against desktop ones.

