Since I’ve been using blocks-based languages lately, I’ve been thinking more
about the challenges of using them, and of programming and learning CS more
generally, when legally blind. One of the students in our Human-Centered
Computing PhD program is legally blind, and he generously came to visit me,
bringing along one of his own students, who is also legally blind and learning
programming.
The first and biggest surprise for me was that most (about 85%) legally
blind people can actually see. One of the people I worked with can see
light/dark (which doesn’t help with programming, but does help him with
way-finding and spatial navigation). The other one loves to program in App
Inventor using high magnification on her Mac. She’s low-vision and finds the
large splotches of color useful in figuring out her code.
The implication, they explained to me, is that some tactile affordances designed for blind users go unused, because low-vision people prefer audio plus whatever sight they have over learning a touch-based encoding. I was surprised to learn that most blind people never learn Braille: it’s a complicated code, and low-vision people would rather magnify the screen than learn it.
Blind programmers who do know Braille will often use an audio screen reader together with a refreshable Braille display that renders a single line of text. It’s easier to scan a line (especially for syntax errors) in Braille than with a screen reader.
The second surprise was about their tools. They showed me Visual Studio and EdSharp, a plain text editor developed by a blind programmer for blind programmers. I asked what features made an editor good for blind programmers. They said, “It works with screen readers.” And really, that’s it. They don’t want specialized tools with non-standard interfaces because of the cognitive load of switching between the standard screen reader interfaces and a novel interface.
I didn’t realize how few tools go to the trouble of accessing the screen
reader APIs and providing good mappings from the interface to text. Processing
(all platforms) and NetBeans (on Windows) are completely unusable for blind
people because they are inaccessible to screen readers. Visual Studio has
become a new favorite IDE, not because of any special features, but because,
as they put it, “it doesn’t crash and I can access it with a screen reader.”
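To make “accessible to a screen reader” concrete: the toolkit has to expose each widget through an accessibility API, and the application has to fill in sensible text for the reader to announce. Here is a minimal sketch in Java Swing (purely illustrative; this is not how any of the tools above are built) using the javax.accessibility hooks that the Java Access Bridge reads on Windows:

```java
import javax.swing.JFrame;
import javax.swing.JScrollPane;
import javax.swing.JTextArea;
import javax.swing.SwingUtilities;

public class AccessibleEditorSketch {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Tiny editor");
            JTextArea codeArea = new JTextArea(20, 60);

            // Every Swing component carries an AccessibleContext; screen
            // readers (via the Java Access Bridge on Windows) query it to
            // decide what to announce. Without a name and description, the
            // reader has nothing meaningful to say about this widget.
            codeArea.getAccessibleContext().setAccessibleName("Source code");
            codeArea.getAccessibleContext()
                    .setAccessibleDescription("Editable program source text");

            frame.add(new JScrollPane(codeArea));
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.pack();
            frame.setVisible(true);
        });
    }
}
```

An IDE that skips this step, or that draws its interface in ways the accessibility API can’t see (as Processing and NetBeans apparently do), is simply invisible to a screen reader.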
I was particularly interested in the low-vision programmer’s use of App Inventor. We talked about what didn’t work for her and brainstormed what would make it better. One of the tougher parts of blocks-based languages is that scripts can be anywhere in a 2-D space. It’s hard to scan a 2-D space with a zoomed interface, and there’s no obvious interface for screen readers. Having blocks snap to a grid would go a long way toward making scripts easier to find for both kinds of blind programmers, as the sketch below suggests.
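Here’s a sketch of the snap-to-grid idea (the cell size is made up for illustration; App Inventor doesn’t work this way today): when a block is dropped, its free-form coordinates get rounded to the nearest grid cell, so a zoomed or screen-reader view can walk the cells in a predictable order instead of hunting across the whole canvas.

```java
public class GridSnap {
    // Illustrative cell size; a real blocks editor would pick one
    // matched to its block geometry.
    static final int CELL = 40;

    /** Round a free-form canvas coordinate to the nearest grid line. */
    static int snap(int coordinate) {
        return Math.round((float) coordinate / CELL) * CELL;
    }

    public static void main(String[] args) {
        // A block dropped at (137, 92) snaps to (120, 80).
        System.out.println(snap(137) + ", " + snap(92));
    }
}
```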
We talked about how CS classes might be better designed for legally
blind students. I was surprised to learn how much they dislike active
learning activities in classrooms. They said that
when the whole class breaks into small group discussions, they can’t hear their group. Groups are defined by physical proximity, but they can only discern “close” by “loud,” so they end up listening in to whichever group around them is loudest. They need a different kind of active learning activity.