Once at MIT a blind student was showing me his screen reader. I couldn't make sense of the sounds at all, and asked if it was in some special code. He said no, it was normal language, just really fast. He had it read the current line on his terminal (this was the late '80s). I could see the words it was supposedly reading, but I still couldn't map them to the brief burst of sound that came from the screen reader.
So maybe if you've been blind a while, the audio captcha isn't so hard to decipher. Keep in mind they have to make it hard for speech recognition software to decipher too.
A few years back I attended a tech talk by T. V. Raman (of emacspeak fame), and he was poking fun at us poor sighted users for being so slow at deciphering the bursts of sound from the screen reader.
He, and the other blind computer scientists present at the talk, were so used to the screen reader that they could crank its speed up to levels impossible for sighted users (who don't exercise their hearing that much).
The same applied to the tiny distortions in pitch and tone that he'd use as a replacement for syntax coloring.
Yup, my sister is blind and it's the same thing. I hear her having something read off her screen and it's just way too fast for me to even make out a single word. I asked her about it once and she told me that she doesn't even have it set to read as fast as most of her friends do!
I used to do a lot of accessibility work, so I was somewhat proficient at JAWS and IBM's Home Page Reader.
JAWS, in particular, is quite powerful, but pretty daunting to learn. Compounding the clunkiness of JAWS is that many apps and sites aren't built with screen reader usability in mind – even if the developers made a good effort to ensure that it was possible to use their site with a screen reader, that all text was readable, etc., there's a huge difference between "possible" and "easy".
I thought it would be interesting, from an HCI perspective, to use a screen reader with something like a phone in your pocket, but sadly the learning curve is just too high, and there's not much motivation to improve it, either from the perspective of app/site developers (blind users are a rounding error) or even of the tool makers (blind users are highly motivated to learn their product, and will purchase it regardless of learning curve).
SEO, ironically, is the greatest thing to ever happen to blind users. All those people with tremendous financial incentive to make their sites spiderable to a text-only bot.
> JAWS, in particular, is quite powerful, but pretty daunting to learn. Compounding the clunkiness of JAWS is that many apps and sites aren't built with screen reader usability in mind – even if the developers made a good effort to ensure that it was possible to use their site with a screen reader, that all text was readable, etc., there's a huge difference between "possible" and "easy".
Seconded. I've done a bit of accessibility testing work with JAWS and was often frustrated by how unnecessarily difficult it was to use the software with a screen reader. You have to remember that it is probably uneconomical to make most software accessible, and companies only do it because accessibility is a requirement if you want to sell to certain governments (e.g., the USA). What this means is that development and testing are focussed on doing the bare minimum needed to pass a checklist, after which there isn't any need to do anything more, regardless of whether the software is "easy" or merely "possible" to use.
When I was at MIT I had a lab partner who was blind for, I think, 6.002 or 6.003. I remember that one side effect of using a screen reader was that he could easily "recite" code, which I don't think most sighted programmers could do fluently.
Of course, in order to understand what he was reciting, my only hope was generally to try to type it into a text editor, format it and look at it. (And it's also hard to transcribe code being read to you.)
My mom and I used to take turns reading the code listings from magazines like Compute! and RUN to each other and typing them in. It took forever. I definitely find it harder to do that with punctuation-heavy languages these days (not that I'm typing much code in from magazines anymore).
Oh, god, I remember doing that for the TRS-80 (was it assembly? That's what I remember). You'd spend hours typing in code from a magazine, and I think my Dad and I had about a 5% success rate of actually getting the program to run at the end.
RUN often had BASIC programs that built machine-language routines from DATA statements, just a series of numbers. At one point they changed the program so that one of the bytes in each DATA statement was a checksum, and it would tell you if the rest of the line was accurate or not. That helped a lot.
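The idea was something like this (a rough Python sketch for illustration only; the sum-mod-256 scheme and the names here are my assumptions, not RUN's actual algorithm):

    # Sketch: verify one typed-in DATA line whose last number is a checksum.
    # The sum-mod-256 scheme is an assumed stand-in for whatever RUN used.
    def verify_data_line(values):
        *data, checksum = values      # all numbers except the last are payload
        return sum(data) % 256 == checksum

    # e.g. a line typed in from the magazine: DATA 169, 0, 141, 32, 208, 38
    if not verify_data_line([169, 0, 141, 32, 208, 38]):
        print("typo somewhere in this line -- retype it")

The point was that a single mistyped number in a line would almost always break the check for that line, so you found out about typos right away instead of after hours of debugging a program that wouldn't run.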
My mom likes to say "I was just having fun, typing in the games and playing them, but you were actually learning something."
That was my first introduction to the notion of checksums and parity bits too. Very practical, with in-your-face, real-world consequences. An excellent way to learn the concept. I don't miss having to type in all my software by hand, but that sort of forced, physical learning and in-your-face exposure to syntax and data patterns was really effective, and it's something worth missing.
This is a classic example of how the culture of home parenting, and the individual choices parents make, has a big impact on kids. Having your mom, of all people, read code listings from a software magazine and type them into a computer with you pretty much puts you in the top 1% in terms of parental engagement in a child's intellectual development. Race itself is irrelevant compared to this. Nationality itself is irrelevant compared to this. What the parents do: extremely relevant.
True, but race and nationality are pretty relevant to whether you have a mom with convenient access to software magazines and computers to type the code into.
I don't think mkramlich was meaning to imply otherwise, because the relevance of parents' involvement doesn't just apply to technology. Engaged parents who involve their kids in engine rebuilding or animal husbandry or child rearing or anything else that requires years, effort, and guidance to master are most likely in the top 1% in terms of engagement in their child's intellectual development.
Technology and computers weren't my mom's hobby or profession (she's actually a seamstress -- and I learned the basics of sewing from her), so it wasn't as if she was introducing me to her own interests; however, when my son gets older (out of diapers), I plan on spending time with him, teaching him about my major interests and learning new things with him that he's interested in, whatever those may be.