![](https://lh3.googleusercontent.com/blogger_img_proxy/AEn0k_useBGBmJBKVFmTcObaii3CTQbub4mH9IHommK-HZzsytipIvHLk6K19jAE3rZ8uItEBdeBHtlwOZ9E9vs-wLMz7Yz_85_sLdLNMHehAFqsl8WIE5_nIodRr3-_YUTCuv25mf6EhhyrUasVcmP39eqko8hp8n_RDs_oqXiJSIrWCQRQstdcMUHoaw=s0-d)
Sam Spratt / Gizmodo
By John Herrman
Gizmodo
Your eyes are absorbing this Web page. They're passing over this, this, then this word, right now. That's how reading works, online: You take this for granted. But what if you couldn't?
We grant our gaze to electronic screens for most of the day, and in return, they give us anything we want. We stare; they glow. We rarely speak, and neither do they.
And this makes sense! The Internet is a boundless collection of text, images and video, channeled to flat pieces of glass and plastic, beamed through lens, retina and nerve, all the way into our brains. It can show us anything, and for most Web users, that's exactly what it does.
But for millions of others — those who are unable to see — the Web is a wildly different place. Characters become sounds. Layouts are meaningless. Images are, at best, words and, at worst, blank spaces. And yet the blind browse the same Internet as everyone else, every day. They use the same gadgets the sighted do, and happily. But how?
The sightless Internet

The most common way for the vision-impaired user to access the Internet is with a traditional browser and text-to-speech software. You're probably already vaguely familiar with some of it — Windows users will have come across Microsoft Narrator, and I defy you to find a single Mac OS user who hasn't forced VoiceOver to hurl insults at his friends. These are the tools — or tools like these — that millions of people depend on to access the Internet.
But to say that blind users just "hear" the Internet is a gross oversimplification. It's not just text and images that blind users miss; it's virtually every part of the fundamental browsing experience.
![](https://lh3.googleusercontent.com/blogger_img_proxy/AEn0k_uVVM1-YyNFZn41mkb-4lx4m0ZfDIkyf-P6mg6vYspb-mQYPeNcngKNnk1iI6xjVLvdUMwjGMcCy6BdmqFndeTgEA638qhJxzkmFqkuXf8JrMLu0vRl9B7jYS5qyqUJAqRF0-TMAUBXX26KZT9g_Rgy=s0-d)
Millions of people depend on tools such as the Mac OS X VoiceOver to access the Internet.
Here, try this: Stop reading for a moment. Lean back and survey this page. Now think about what you do when you visit a news site. Your eyes are probably drawn to the stories listed across the top of the page. They look important, right? Why else would they be up there? Further down you'll see the site's banner, but you probably don't spend much time looking at that, and your eyes dart to the list of stories in the middle of the page. You scroll down, glancing at pictures then headlines, or perhaps headlines then pictures. The margins of the site are either full of ads or static information, so you probably don't pay them much mind.
Your habits aren't just sight-dependent (obviously), they're pretty weird. Your eyes fly around, sometimes randomly and sometimes in response to cues onscreen. You hunt for links and cherry-pick from galleries. The word you're looking for catches your eye, so you click it. Consciously or subconsciously, you usually know where to look.
With a screen reader, there is no "looking." It's a simple parser, and it starts at the top. It combs through a website a lot like a Web browser combs through HTML, except instead of rendering an IMG tag as an image, or an EM tag as italicized text, it converts them to sounds: a readout of the image description — the alt text — and a changed audio inflection, respectively.
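If you're curious what that flattening looks like in practice, here's a rough sketch in Python (not how VoiceOver or JAWS actually work under the hood; just a toy that walks markup in document order, swaps each IMG for its alt text and flags EM text with an inflection cue). The sample markup and labels are invented for illustration.

```python
from html.parser import HTMLParser

# A toy "screen reader" pass: flatten markup into the sequence of things a
# speech engine would be handed, in document order. Purely illustrative.
class ToyLinearizer(HTMLParser):
    def __init__(self):
        super().__init__()
        self.spoken = []         # lines of output, in the order they'd be read
        self.emphasis_depth = 0  # are we inside an EM tag?

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img":
            # An image becomes its alt text, or an unhelpful blank without one.
            self.spoken.append("image: " + (attrs.get("alt") or "(no description)"))
        elif tag == "em":
            self.emphasis_depth += 1

    def handle_endtag(self, tag):
        if tag == "em" and self.emphasis_depth:
            self.emphasis_depth -= 1

    def handle_data(self, data):
        text = data.strip()
        if text:
            # EM text gets a spoken inflection cue; an I tag would not.
            prefix = "(emphasized) " if self.emphasis_depth else ""
            self.spoken.append(prefix + text)

sample = """
<h1>Top stories</h1>
<img src="lead.jpg" alt="A firefighter carries a child to safety">
<p>This is <em>really</em> important, and this is merely <i>italic</i>.</p>
"""

reader = ToyLinearizer()
reader.feed(sample)
print("\n".join(reader.spoken))
```

Run it and you get the page as a flat list: the heading, then the image's description, then the text, with only the emphasized bits marked. Everything visual about the layout is gone.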
Then, of course, there's all that text. On a visually rendered Web page, it lives in blocks and columns. If you're lucky, these blocks and columns will be organized in a logical or familiar way. They'll be laid out, basically. But that's such a visual concept. What happens when a layout becomes words?
"Screen-reading software presents the Web page as a set of lines and links, and possibly other things — frames and headers, if the software employs that." That's Paul Schroeder, vice president of Programs and Policy for the
American Foundation for the Blind.
Vision-impaired himself, he uses screen reading software for daily browsing. "When you log onto a website using screen-reading software, what you start with is a site that tells you how many lines, and some basic structure — but not very much. When you're experiencing a cluttered site, the information you want may be 300-400 lines in, and if you're going line by line, or section by section, it can take you a very long time to find what you want."
Think about that: The Internet is anything but linear—website code is nested and cryptic, and often looks jumbled and out of order. (Right click, view source! Oh, yikes, maybe don't.) Websites often have multiple visual directions, or sometimes none at all. Yet audio screen readers — and Braille modules, which display about one line of text at a time — have to render them in sequence, somehow. And listeners have to make sense of it, to develop some kind of intuition for a site's layout and structure based on very, very small amounts of information, all out of order.
![](https://lh3.googleusercontent.com/blogger_img_proxy/AEn0k_sFvfpYFan8zLUYGkxA1EwxJ-mUClTRY9yyS92ENeVeJyUjYTedqgi0Zo-r8wFvquNsWcKjHKMG-2rbSeBY_V3oqvI0pKX_AEfh3i6dkz9AIVZD2NPN4j0yvZ-nBNFuRXyDaYFmg4Z46-ioCyDEmYfO=s0-d)
The SyncBraille is touted as one of the smallest, lightest and most affordable portable Braille displays in the world.
Of course, there are tricks. Screen-reading software, like VoiceOver in OS X or JAWS for Windows, is more clever than I've made it sound. It parses websites for headers, and sometimes navigational elements. It can give you a literal description of a page's layout — "three columns, two rows" — and its surprisingly unrobotic voices reflect all kinds of punctuation. It even differentiates between outwardly identical tags. My editor actually just sent us an email to this effect:
Stop using EM and I tags interchangeably. I is for italics, and EM is for emphasis. It's a difference you can't see, but it's a difference some will hear.
These are the small features that make spoken Web pages usable, but they can't be taken for granted: People who design websites have to be vigilant about using headers to divide large blocks of text, including alternative text for images, and using their tags properly. Problem is, a whole ton of sites don't. Ever had — or overheard — a tedious argument about whether or not a site is "standards compliant," as in W3C, HTML compliant? Well, this is like that. Actually, this is that. The W3C defines standards for accessibility just as it defines standards for the rest of the Web. But like those other standards, they're often disregarded.
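To give a flavor of what that vigilance might look like in practice, here's a minimal sketch of an automated check that flags missing alt text and long, header-less stretches of copy. It's illustrative only: the threshold and the audit() helper are made up for this example, and real audits lean on tools built around the W3C's actual accessibility guidelines (WCAG).

```python
from html.parser import HTMLParser

# A minimal accessibility lint pass: flag IMG tags without alt text and long
# stretches of copy with no headers to navigate by. Illustrative only; real
# audits use tools built around the W3C's guidelines (WCAG).
class ToyAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.problems = []
        self.words_since_header = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            self.problems.append(f"image {attrs.get('src', '?')!r} has no alt text")
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.words_since_header = 0

    def handle_data(self, data):
        self.words_since_header += len(data.split())
        if self.words_since_header > 500:  # arbitrary threshold for this sketch
            self.problems.append("500+ words with no header to navigate by")
            self.words_since_header = 0

def audit(html):
    checker = ToyAudit()
    checker.feed(html)
    return checker.problems

print(audit('<h1>News</h1><img src="chart.png"><p>Short paragraph.</p>'))
# -> ["image 'chart.png' has no alt text"]
```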
And even a totally compliant website can be overly complicated, or simply too liquid. "Facebook is a good example, because it's an ever-changing environment," says Schroeder. "Some users master particular aspects of Facebook, find that the programming has changed, and have to rethink their strategy."
But again, there are tricks: "Vision-impaired users who frequent Facebook and similar sites do one of two things: They either use the mobile version of the site, which is less cluttered, or they simply engage the specific thing they want to do and remember the specific things they want."
![](https://lh3.googleusercontent.com/blogger_img_proxy/AEn0k_vwP9lyFRgpeCU0XemN1QCUgeiUaGqlYx-oamsFp1i389qDANmpMbhMVyuAmscN3v8b_ks1rdw8jSosvX6Zq9zdwKdGnrUVGCLlJz8cB1aLH907zMi1uYiWXdPInhFLulCbLFj2pscq4_iPzD-ofM5P=s0-d)
This is the VoiceOver menu for the iPad.
Properly coded websites, intelligent software, and a stock of handy shortcuts and tricks mesh together to make browsing the Web tolerable for the vision-impaired. Skill and persistence play a large part too. Schroeder tells me that practiced blind users can hop through site headers and run searches so quickly that, in some situations, they may be more efficient than sighted users.
And pending legislation could leave us with a much broader interpretation of the Americans with Disabilities Act, one that would require certain commercial websites to do those little things that make screen reading easier. But it's a constant struggle, with new technologies often outpacing the tools necessary to parse them. Oh, and I almost forgot: The Web is dead. Or something.
Gadgets and apps

In case you missed the Wired cover story entitled "The Web Is Dead," here it is. The gist, to brutally oversimplify the piece, is that the Web as we know it, this familiar hodgepodge of websites rendered in browsers — you know, the W3C's standards-based Web — is falling out of vogue, making way for the new Internet: the Internet of apps.
I don't totally buy it, but that's not the point. Apps are everywhere, and so are the devices that run them. I read as much on my mobile devices as I do on my laptop, if not more. So if the future runs on an iPad, what does that mean for the guy who can't see?
It's really a two-part question, so let's start with the fun half. The rise of touchscreen gadgets, flat, featureless panels that they are, is actually great news for blind folks. Let me put that another way: If you're unable to see, the iPhone, with its virtual buttons and complete lack of tactile feedback, is actually easier to use than, say, a BlackBerry, with its dozens of buttons. Weird! Well, not really.
Part of the story here is software. iPhones (and now Android phones) have sophisticated text-to-speech functionality, without which they'd be useless to the vision-impaired. BlackBerry phones, on the other hand, basically don't.
But even if RIM released an update giving all of its button-based phones flawless screen-reading abilities, they couldn't measure up to a touchscreen device.
When you use a BlackBerry (or a Mac, or a PC), your sense of place is defined by sight. You move with a cursor, or a highlighted menu item. Then you click. And for the same reason Web layouts aren't very helpful to a blind person, the cursor paradigm — hell, the whole button-input paradigm — sucks. With a touchscreen, though, your fingers provide your sense of place. iPhone users can turn on the VoiceOver function, tap anywhere, and hear a narration of what's happening. Tap the upper left section of your screen, right near the volume switches, and a voice might read, "Camera app." Tap the bottom left, and you'll hear "Phone." With buttons, mice and keyboards, you're stuck back in that slow, linear screen-reading world. With a touchscreen, a screen and a piece of software can actually be surveyed. Memorized. Used.
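The interaction model, at least in spirit (Apple's real implementation is far more involved, and this is purely a made-up miniature), looks something like the sketch below: the screen is a map of labeled regions, and touching a point returns whatever sits under your finger, so a layout can be explored instead of recited from top to bottom.

```python
# A toy model of touch exploration: the screen is a set of labeled regions,
# and touching a point speaks whatever lies underneath. Coordinates and app
# names here are invented for the example.
REGIONS = {
    (0, 0, 160, 160): "Camera app",       # (left, top, right, bottom) in points
    (0, 320, 160, 480): "Phone",
    (160, 320, 320, 480): "Mail, 3 new items",
}

def label_under_finger(x, y):
    for (left, top, right, bottom), label in REGIONS.items():
        if left <= x < right and top <= y < bottom:
            return label
    return "no item"

# Tap near the top left corner, then near the bottom left:
print(label_under_finger(20, 20))   # -> Camera app
print(label_under_finger(20, 400))  # -> Phone
```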
So that's pretty neat. But it's a rosy take. Asked about smartphones, Schroeder painted a glum picture: Apple and Google may be doing this stuff right, and building solid text-to-speech into their operating systems, but other companies are lagging. And anyway, text-to-speech in an OS is great, but today's smartphones are all about apps, developed by thousands of people in thousands of configurations. On the iPhone, for example, some apps work perfectly with VoiceOver. Plenty more don't.
Messy as it is, the capital "W" Web seems to be inching closer to universal accessibility. It has a guidebook, at minimum. But all these apps, and all their stores, may be setting progress back a few years. Suddenly, blind users' experience is at the mercy of each individual app developer, or, with any luck, of the companies that provide developers' tools and control access to the app stores. It's not an insurmountable problem, but it's a problem.
In any case, whether you're an app developer, Web designer or just a dude who likes to update his blog every once in a while, remember that someone, somewhere, might be listening to what you've written. And that alt text in images isn't just for jokes. And that it's still OK to force your computer to recite profanities to your friends, for kicks.
Published via:
http://technolog.msnbc.msn.com/_news/2011/04/01/6390761-how-blind-people-see-the-internet