GLEN GORDON: Hello everybody. Glen Gordon here. But no, I’m not welcoming
you to FSCast because I’m not hosting this month. Oleg Shevkun is filling in
as guest host. He has a great episode planned for you. And I’ll be back with
FSCast near the end of May.
Hello, and welcome to FSCast for April 2023. I’m Oleg Shevkun, guest-hosting this episode. I’ve been a JAWS user since version 2.0, which came out in the fall of 1996. And for many years I have been a beta tester for JAWS, and actually prior to that even for Blazie products like the Braille ‘n Speak and the Braille Lite. And I’m pretty sure there are many among our listeners who still remember those products. Maybe you still have an old Braille ‘n Speak somewhere in the closet. I certainly do.
Now, originally I am from Russia, but now I work with Vispero in software testing and live in Germany. And once again, I’m really delighted to be sharing this podcast with you, and discussing some of the issues that I think you might find interesting. And by the way, thanks to Glen Gordon for giving me this opportunity. Now, at the end of our March episode, Glen mentioned that he’s got a number of other things to do here at Vispero. And I can easily guess a couple of those things. One of them is working on new product features, and the other one is fixing bugs.
And talking about bugs and product enhancements, in April we released our updates for Fusion, ZoomText, and JAWS. And there’s some really exciting stuff there. You remember that back in February, in our February update, we added support for the WhatsApp messenger, and we’ve been getting some very positive feedback regarding that support. However, there are a few things that needed to be improved, and these improvements have made their way into our April update. For example, if you’re in a conversation on WhatsApp, and your buddy is typing, then JAWS is going to be continuously playing the typing sound to let you know that your buddy has not gone off to lunch or something like that. And that allows for a smoother conversation.
There are other improvements in WhatsApp. There are improvements in Gmail support, and actually across the entire Google suite of applications. That’s Google Docs, Google Sheets and so on. And for Microsoft Office users, there are also some improvements for our support in Word and Excel. So basically we’ve got some nice stuff going there, and in ZoomText and in Fusion, as well. So if you haven’t downloaded those updates, please feel free to go to freedomscientific.com, read up on the new enhancements, download the updates, and let us know what you think about that.
OLEG: Now, one of the features that we released in ZoomText and Fusion back in February is called True Center Tracking. And that feature has made quite a splash among low-vision users. As a matter of fact, just in a few minutes we’re going to be hearing from an AT trainer who is really impressed by this new capability. However, I think it might be a good idea to explain what it is all about. In fact, we’ve had center tracking for quite a while now. So your cursor or your mouse pointer or the interface item you’re working with could always be in the center of the screen, if you chose to set it up that way.
Now, however, before February 2023, there used to be one exception. If you moved your mouse or your cursor to the edge of the screen, ZoomText basically did not know how to handle this. And even though you would prefer to have your focused item in the center of the screen, it really could not stay there. So if you got to the edge of the screen, that’s where your mouse pointer or your cursor would still go. Now, True Center Tracking is a new option that changes that. With this option enabled, your mouse pointer or whatever interface item you’re working with, or whatever spot the cursor is in, will always stay in the center of the screen.
But what happens if you move your item to an edge of the screen? Well, visually, that will still remain in the center. And everything outside the actual screen area will be filled with the color of your choice on your monitor. So it’s basically like you’re moving around the screen. You’re getting close to the edge. Whatever you are viewing stays in the center; and the other area, the other part of the screen, is filled with the color of your choice. I personally like that to be something dark blue, but it’s up to you to experiment.
Now, I also have some personal experience with that feature. Normally I’m a JAWS user, but sometimes I like to have things magnified. The problem for me is, whenever I use screen magnification for more than a couple of minutes, I get eye fatigue because of things always moving on the screen. I would really like to have that central point on the screen where I can focus no matter what. And that point isn’t going anywhere.
So guess what? With our February update, I tried True Center Tracking, and it makes a lot of difference. Even though screen magnification is still never going to replace speech or braille for me, it is now a viable option to see a character or see a picture or explore something, and to know that whatever is under the mouse pointer or whatever is under the cursor is always going to stay in the center of the screen. And we’re going to be hearing more on that in the next section.
OLEG: On our FSCast, we often talk about JAWS and other blindness-specific products. But Vispero has a number of products for visually impaired people who are not blind, for people with various degrees of vision limitations. And that sometimes presents a challenge, and the challenge is how do you train a person in using those products? How do you even introduce a person to these products? And how do you progress, if there is a need, from low vision to blindness products? We’re talking today to Mandy Van Cleave. Mandy, welcome to the show, and great to be talking with you about those things.
MANDY VAN CLEAVE: Thank you.
OLEG: So Mandy, could you please say a few words about yourself to our listeners, who you are and what you do, so that they know that they’re talking to an expert?
MANDY: Well, prior to becoming an assistive technology user and trainer, I did work in information technology, which was basically PC and server maintenance and support. And you know what? Those are great skills to have in this work. I started in April 2011 at Cincinnati Association for the Blind and Visually Impaired as an access technology specialist, working with individuals with varying degrees of vision loss and sometimes physical challenges, as well.
OLEG: If I might ask, was it somehow related to your own vision loss or your own life experience?
MANDY: Not at the time. I actually didn’t really find out about that until about nine months into the job, that I have a progressive cornea condition that would affect my vision slowly and gradually.
OLEG: Was that a shock, or did your job actually prepare you for news like that in your own life?
MANDY: Exactly. It did. I think that my reaction, had it been prior to this job, would have been completely different. And I saw how much independence a person with a visual impairment could have by being around the work. I have had a great head start at learning technology that I may need in the future. Not sure when I’ll need it, but it’s there. And I feel that I’ve learned enough that if I wake up tomorrow with a lot less vision, I would already know how to use my devices.
OLEG: Do you think there is a gap or more of a continuum between how a computer or a mobile phone is used by a blind person or a vision-impaired person or a sighted person? I mean, sometimes we say, yeah, blind people use shortcuts, sighted people mostly don’t use shortcuts, although that can be debated. Is there like a bridge, or do you have to jump from being sighted, to being low vision, into being blind?
MANDY: That would probably depend on the individual and their condition. If somebody loses a lot of vision suddenly, there is a big gap. I think some things are harder to do on the mobile devices, and you’re doing things a lot differently than a smartphone user without any vision problems at all. There are just things that are a little bit harder to do on the smartphones and tablets. Whereas on the computer, I knew quite a few helpful Windows keyboard shortcuts before really needing to use them on a productivity level. So I could gradually ease into that. But most individuals, they’re working with the vision that they have. They’re not necessarily going to know what to do if they have to stop using the mouse. So that’s where we come in as trainers to bridge that gap.
OLEG: Suppose I come to see you as a trainer, and I say, yes, I am a Windows user. I like my Windows computer. But Windows has Magnifier, and Magnifier can magnify quite a lot. And actually, I don’t even need to use Magnifier. I can go into my font settings. I can make the fonts larger, update my Windows theme and so on. So where do you disagree with that, or where would you direct me to another way of doing things? Because I assume many people will come with ideas like that.
MANDY: Oftentimes, it’s not that the individual comes with all this knowledge, that they’ve used many options before. I might find that somebody has set their resolution lower to enlarge things across the board. And that’s not really ideal. A lot of things are cut off that way.
OLEG: That’s probably the worst thing to do with the resolution. You’re not going to see half of your screen, actually. And you’re going to be wondering why the interfaces are all so distorted.
MANDY: You might be surprised that that’s the go-to option that whoever helps the person out with that particular problem, that’s what they know how to do. They weren’t even aware of the...
OLEG: So part of the advice would be do not change the resolution to make things larger on the screen. Will you agree with that?
MANDY: I would, because most of the time it’s degrading the quality of the image. Lowering the resolution, I mean, that’s the most common option. It’s one that’s very easy to use for another person who’s not as well-versed in the Ease of Access and accessibility settings. Or some individuals will use their handheld magnifiers on the screen. I definitely let them know there’s a better way. Those are the individuals who, once I do show them options on their computer, might like the lens view instead of full screen view within Magnifier or ZoomText.
OLEG: How do you make the jump from built-in Windows tools to specialized applications like ZoomText or Fusion?
MANDY: Many users feel the built-in options are just not customizable enough for their preferences. And I can give you two examples of my favorite features in ZoomText to show individuals with that dilemma. The first is the color enhancements. That’s usually the one that is the bridge from using the built-in Windows options to saying, let’s graduate up to ZoomText and try out the color enhancements, like inverted brightness, the two-color effect, or a custom color enhancement in ZoomText and Fusion. This allowed one user, who knew that he could see green text on a pink background the best, to use his computer visually. Those are contrasting colors for him, and the two he could best differentiate from other colors.
And the other example comes in with multiple monitors. I don’t know if you’ve ever tried Windows Magnifier with multiple monitors, but that’s usually the deal breaker, where an individual will say, “Oh, well, ZoomText has this multiple monitor support.” Well, I let them know about it if they don’t know. And the need to use multiple monitors and have more customization over their preferences is usually a good deciding factor for an individual.
OLEG: With color enhancements and color replacements, I understand you don’t even realize how beneficial that may be to you until you actually try it. So do you actually show the features of the software, or do you let people, or encourage people, to explore the software? Where is the balance between guiding a person and, on the other hand, letting them explore?
MANDY: Well, when we’re just starting out and evaluating their needs for what software and what hardware will be most helpful for them to adapt their technology to their individual needs, that is when I will demonstrate the features that I think will be helpful to them, especially in comparison to the built-in Windows features or other options that are out there that they may have heard of. Or typically they haven’t heard of anything. So they don’t know what they don’t know and what they need to learn. And that exploration that you mentioned, that comes with time. They’ll slowly poke around in the settings. And some people like to set it and forget it, and they just use their larger pointer, their magnification level, colors set the way they like them. And then there’s not much that they like to change after that. But there are plenty of users who get in and poke around and kick the tires.
OLEG: I want to turn for a moment to somewhat of an emotional aspect. And let me preface this question by saying that I come from Russia, and maybe what I’m going to describe is not found in the States. I mean, maybe that’s a typical Russian phenomenon. Many people, at least in the country where I’m from, will say, okay, I’m losing my eyesight, but I’m not going blind. Yeah, you can show me the equipment and the software all you want, but I’m going to try to hold on to the way I used to live, including using the computer and the smartphone the old way, the sighted way. Is that typically Russian, or do you see things like that in your own work, in your own practice, in your own experience? And if you do, how do you deal with that?
MANDY: Oh, absolutely, that’s – I think that happens everywhere. And we like to compare that to the stages of any kind of loss, where usually that first stage is denial. And then there may be grief, and then eventually there is the acceptance. That’s international, I believe. Usually a vision fluctuation will take care of that attitude fairly quickly. I won’t say it’s an attitude; but, I mean, at the beginning it is. It’ll definitely cause a person to refuse help until they absolutely need it. And at times they often do say they wish they had sought out help sooner so that they would have had more time to adjust.
And it really requires early intervention from their eye care professionals to make them aware of what they need to expect with their vision condition and to make the appropriate referrals to agencies. I mean, there have been doctors in the past who waited until they could no longer help the individual. But with earlier intervention, a person could slowly and gradually ease into all of the training that they would need, without it hitting them all at once. So we try to intervene as early as possible.
Now, sometimes it’s too early. This is especially true with individuals who get diagnosed because another family member got diagnosed, through DNA, and they found out it was a hereditary condition. And maybe they’re not at the age where that would affect them yet. But I think that if they get the opportunity to slowly and gradually ease into their vision loss, they usually have an easier time navigating all the different areas of their lives that they need help in.
OLEG: Well, suppose you have a client who is losing their eyesight, who is losing their vision, and you need to help them take the next step. They’re under lots of stress already. How can you just come in and tell them, hey, you know what, you’ve got to memorize 20, 30, 40 more shortcut keys or quick keys? How do you make the weight of those new ways of working with a computer manageable?
MANDY: Most individuals are motivated in some way or another based on the activities that they want to continue to do using their computer, whether that’s using social media, communicating with family, doing work for a job or for school. There’s plenty of reasons that motivate a user to learn what they need to do. And that’s another thing that can be done over time. They don’t always jump into the hard stuff right away. In fact, we usually start out with more simple tasks and work our way up, depending on the individual’s computer knowledge.
OLEG: You’re not going to overwhelm them.
MANDY: Right. I work with individuals anywhere from age six to 96. And a lot of times it depends on their comfort level with technology to begin with. And you can think about how much they rely on speech. So an individual who might need some light magnification is usually not interested in using speech. But I still like to make those users aware that these options exist. A lot of times this will ease their mind that, if their vision does worsen, they would still be able to continue to use their computer.
You know, most low-vision individuals are very surprised to learn that a person with no usable vision could still use the computer. And maybe that’s something they were thinking about. And a lot of people, by the time they get referred for services, their email has piled up. They’re a little bit or a lot behind on checking email. That’s usually one of the first things that they sort of put on the back burner. We also look at, once a user is past those light options, and maybe they’re zooming in quite a bit, especially if you think about over 8 to 10x, they’re missing out on a lot of information on the screen. And speech could be very helpful. That’s usually when we start to think about screen readers, adding JAWS to the mix. When magnification isn’t enough for the individual, oftentimes text is still going to be blurry to a person, but they can see the outline of the text. They know it’s there. They could use options like mouse echo.
OLEG: Yeah, you can basically point with a mouse, and it reads it out to you, which gives you a transition from your eyesight to speech use.
MANDY: And then you have your individuals who will get eye fatigue from reading and writing, especially when that’s really lengthy information. So they can either use speech at the end of the day when their eyes are starting to feel tired, or, one thing that I like to suggest users do is use speech throughout the day, and that will offset that eye strain and fatigue. And I even had a user comment that that helped her save some of her vision to watch TV in the evening.
So there are so many varying stages of vision that we never know what we’re going to be working with or what the individual’s needs are. But we assess that on an individual basis. And users have a lot of preferences about whether they think they’d like speech or whether they think speech will be helpful. But then if there’s a greater degree of vision loss, they most of the time understand that the best choice is to just go straight to speech. A lot of people know that there’s not enough information in their vision to use the computer visually. So they’re also ready to go straight to those screen reader options with the higher degrees of vision loss.
OLEG: Now, for a sighted person who is looking at the blind person using a computer, what is the main thing they need to understand in order to be efficient in helping that person? Or in assisting that person? Or in working together with that person?
MANDY: The most important thing for them to understand is that what they see on the screen is not the same as what the individual with the vision impairment sees on the screen. In our low vision services, they have vision simulator goggles. And family members can actually have that opportunity to see through the eyes of someone with the same condition as their family member. And I think it is helpful to see even pictures online where there’s a full picture, and then next to that a real splotchy picture that represents certain vision conditions.
So I think it’s very important for anybody interacting with a person with a visual impairment using the computer to understand that part, that this is a different way to use the computer, based on the individual’s needs, that the sighted person typically cannot relate to. It helps if that individual can understand the way that the vision condition affects the person’s view of the computer. And oftentimes an employer doesn’t understand those differences. The employer watching that individual use the computer could skim the whole screen at once, but that may not be true for the employee trying to use different adaptations.
OLEG: And finally, Mandy, you started 12 years ago. Do you see much of a difference? Do you see much of a development, not just technologically, but in terms of social awareness, between when you started and where you are now? Have your clients changed? Have the conditions changed? Or is it like, hey, we’d like them to change, but it’s pretty much the same? So 2011, 2023, your job, your clients, availability of services, awareness. How would that compare?
MANDY: The best way I break that down is to compare the first six years to the most recent six years and what has developed. And, you know, the first several years of my career in this field were the same: same options, same software, same hardware. Not much was changing around that time. And then, bam, high-definition CCTV video magnifiers came out. And then we started getting more and more features built into the smart devices, and Windows and Mac, and more and more devices coming along. It was really great to have all these options; things just started to explode around that halfway point, from then until now. I just think the technology has exploded. There’s a lot more to keep up with. We’re getting more and more new features like that True Center Tracking.
The way I approach a new feature, I read about it and think about what users that I’ve worked with in the past might be interested in learning about it. And one of my first impressions of True Center Tracking was, this will be great for a person who maybe has peripheral vision loss, and maybe they only have their central vision. But the more I explored it, the more I realized, oh, this would actually be just as helpful for a person with some central vision loss, because then, with True Center Tracking, they can actually use a little bit of their peripheral vision to see the screen, and they can keep it in the same spot. They don’t have to search the screen as much.
I noticed that, for people with bigger monitors, before the option of True Center Tracking was around, there was some information outside of their view in the corners that they couldn’t bring to the center until now. I tried the different alignments. I tried mouse pointer alignment with edge and centered. It’s something I’ve noticed over the years about mouse pointer alignment, with the features that we’ve had all this time: within edge margins, and then the option for centered within the zoom window. Those options have been around for a while.
And what it usually comes down to with the mouse pointer alignment is whether that makes an individual dizzy or gives them motion sickness. And what I have noticed is that, if centered gives a person a little bit of motion sickness, then the edge option does not. The reverse is true, too, so that if edge makes a person dizzy, centered does not. So that has been an option that I have worked with for quite some time. And it’s even better with True Center Tracking enabled. Before, when my cursor was toward the edges of the screen, it was all the way to the left of my monitor, up at the top corner, where that may be out of a person’s view. They like having a large monitor, but they couldn’t always get that in the center. But now they can. So I also tried it just separately with text cursor alignment. And you notice that the cursor is only centered where it can be and not near the side edges of the screen. So I just really enjoyed keeping that in the center so that I could keep my focus directly at the center of my monitor.
And I have to say, wow, wow about the control and menu alignment with True Center Tracking enabled. There’s a subtle difference if you try it centered without True Center Tracking. You might notice this difference and then enable True Center Tracking. And when you do the ALT+TAB, TAB, TAB to cycle through the applications you’re running to switch applications, that was wonderful. The center of every thumbnail was at the center of my screen as I rotated through those applications. Then I turned them all on and tried that out. And just having everything in the center was great for bringing everything directly to the center of the view like we haven’t been able to do before. And I am just amazed with this new feature.
OLEG: So thank you, Mandy. Thank you for your time. It’s really, really been a good experience hearing from your perspective, especially how you help people to deal with the reality of sight loss and not just to be informed, but also to be equipped. Keep doing this good job.
MANDY: Thanks.
OLEG: I want to follow up on something that we discussed with Mandy Van Cleave earlier in this interview. And it is this whole issue of blind people working in an office setting with sighted colleagues; or blind people interacting with sighted family members, friends, relatives and so on. And quite often there is a problem of communication. We can be listening to our computer, or we can be reading our Braille display, and we cannot really explain where on the screen that portion of text is located, what exactly we’re reading, what exactly we’re getting. And hence our sighted colleagues, our sighted friends are having a problem figuring out where we are or what we’re doing.
There are a couple of features in JAWS that can help with this issue. Now, granted, they were initially designed for a slightly different purpose. They made their way to JAWS from ZoomText or from Magic, which was our earlier screen magnification product. And they were designed to help low-vision users. But actually this is an excellent communication tool whenever we work alongside sighted users. Now, one of those features is Text Viewer. And there are two very similar viewers in JAWS. One of them is Braille Viewer, and the other one is Text Viewer. And the command to bring up those viewers is INSERT+SPACE+B, or rather JAWS Key+SPACE+B. That’s a layered keystroke for Braille and Text Viewer.
So you start with INSERT+SPACE+B, and then you press T for Text Viewer. What happens then is you get a window at the top of the screen where JAWS is actually displaying the information that is currently focused. It’s not exactly a repeat of your speech. So, for example, if you do INSERT+F12 to find out the current time, that’s not going to be displayed on the screen. However, when you are in a document, and you are reading something, or you’re editing something, or you’re going through some controls, JAWS will display in that Text Viewer whatever is in focus or whatever is under the cursor.
Now, you can set it to your liking. For example, you can change the font size. You can change the colors. And by the way, if you change the font size, then that Text Viewer window is also going to get larger. And you make those changes in Settings Center. So you press INSERT+6 for Settings Center. Then you go to Braille and Text Viewers. And there are a couple of subnodes there. There’s a general subnode where you set your general settings, things like font size or font color and so on. And then there are separate settings for Braille Viewer and Text Viewer. Experiment with these to give you the best experience. And if you’re working alongside sighted colleagues, ask them what would work best for them because that’s, as I mentioned earlier, a communication tool.
Now, to disable Text Viewer, you press INSERT+SPACE+B T once again. So that’s JAWS Key+SPACE+B T. But there is another dimension to this whole issue of working together with sighted colleagues. And that comes up whenever we are in a virtual cursor environment, such as on a website or in an email message or a PDF document. And again, the question is the same: where exactly are we reading, what exactly are we interacting with, and where is that on the screen?
And by the way, if you have some residual vision, you might be asking the same question, as well. So you see something on the screen, but it would be nice for you to have an idea of a visual layout of the page. And again, there’s a feature that’ll be helpful, and that’s called Visual Tracking. So to get this up and running, you go to Settings Center. And actually, I like to activate this on a per browser basis, such as for Microsoft Edge or for Chrome. And so in Settings Center you search for Visual Tracking. And there are a few options. You can choose the color, or you can choose the type of your tracking. And that’s going to be to track and show on the screen exactly where your virtual cursor is.
If you’re working with sighted individuals, you might choose a color and type of tracking that is unobtrusive. They will be able to work with this, and you don’t really need to worry about it. If you’re setting it up for yourself, however, you might want to choose something more noticeable. Like, for example, I like to make it a block, and I choose red for the color. And that way, whenever I interact with a web page, and I’m wondering about the layout of the page, even my low eyesight allows me to take a quick glance at the screen and see that red block, that highlight, and know exactly where that menu item or that heading or that links list or whatever is positioned on the page. So just two features. They started their life as low-vision features, but then it turns out that they are quite useful to blind individuals, as well. Once again, one of those features is Text Viewer, and the other one is Visual Tracking.
OLEG: Well, in the few minutes that remain, I’d like to talk to you about bugs. Yeah, that doesn’t sound all that exciting. Normally when we release new software versions we love to talk about new features, and we boast about enhancements, improvements, and so on. But sometimes you just have to deal with bugs, and each of our new releases includes a bunch of bug fixes. On the other hand, sometimes we can get into a situation when we, as users, have something about the software that is bugging us, that is really preventing us from working in our normal way, and we don’t know how to report it. Or else we report it to the developers, and the developers say they cannot reproduce it, or they say that the report is too general, and so on.
Now, I’ve been in that situation as a user of Vispero products and many other software and hardware products, and I’ve been on the other end of the situation, as well. I work now in testing here at Vispero, and we receive lots of bug reports. Now, here’s a question. What is it that makes a bug report special? Why is it that some bug reports get action pretty quickly, while others are being set aside with no immediate action? Well, let me share with you a few pointers on how to write a good bug report. I cannot promise that the bug you report will definitely be fixed. What I can promise, though, is that your communication will be noticed.
But before we go any further, let me say that not everything you dislike about the product, or not everything you disagree with about a product, is actually a bug. Sometimes it is actually a product feature, and we’re just not immediately aware of how to use it. Now, case in point. Some time ago I was actually about to write a bug report about what I thought was incorrect JAWS performance in Microsoft Excel. I was using Excel with a braille display, and I was frustrated by the fact that whenever I moved through a spreadsheet, the display was showing me only one cell of the spreadsheet. I wanted it to show the entire line, and it was giving me just one single cell.
So for a braille user, that was pretty frustrating. And I tried to turn line mode on, and it didn’t help, and it should not have. That was my mistake. And then I thought I’d write a bug report. But before writing a bug report, I thought I’d do a quick search. And the search actually revealed the answer to my question. And the answer was, oh well, let me just show you. So we’re going to Excel.
JAWS VOICE: Book 1.
OLEG: And I press JAWS Key plus the letter V to go to Quick Settings.
JAWS VOICE: Quick Settings Excel dialog.
OLEG: So we’re in the search box, and I’m going to type ”braille.”
JAWS VOICE: B R A I L L E. Two search results list box. Braille options. Excel settings.
OLEG: We have braille options.
JAWS VOICE: Braille mode, active cell
OLEG: Braille mode, active cell. And I press SPACE to toggle that.
JAWS VOICE: Space. Current row. Two of three.
OLEG: Current row. That’s exactly what I needed. So if I were to press ENTER here to save the settings, my braille display would be showing the current row. Or I could press SPACE again.
JAWS VOICE: Space. Current column. Three of three.
OLEG: Current column. So no need for a bug report here. What we discovered was actually not a bug. So whenever you run into something you don’t really like, first of all, think whether that’s really a bug, or it could be intended behavior. Suppose we concluded it’s really a bug. And now we come to three basic questions. And answering these questions will guide you to writing a good bug report.
The first question is “What? What do I see? What exactly is going wrong? Please be as specific as possible.” The second question is “Where? Where do I see it? What is the operating system? What are the applications, including application versions, where that issue is exhibited?” Again, you want it to be as specific as possible. So a report that says that JAWS is really slow with a braille display does not tell us anything.
On the other hand, you may be saying something like this: “Whenever I use this application, including this version, under this operating system, and I take these specific steps, then I see the following response from JAWS,” or ZoomText, or Fusion, or any other product for that matter. Now, this is specific enough, and this will most likely give us the information we need. But then there is yet another question, and the question is “How?” That is, how can somebody else reproduce that behavior, or that error? Answering that third question allows you to put together a series of steps. And when you’re writing those steps to reproduce, think of someone else trying to retrace exactly where you’re going. Think of describing the way, describing the path, and try to do it as clearly and as concisely as possible. I’m going to give you an example, and I’m going to ask JAWS to read through the steps.
JAWS VOICE: One, start JAWS.
OLEG: That’s obvious.
JAWS VOICE: Two, using Microsoft Edge, go to [screen reader output].
OLEG: So please note, we mentioned the specific browser. We’re going to this specific website. Now, step three.
JAWS VOICE: Three, press H twice to go to the second heading on that page.
OLEG: Well, fair enough. Step four.
JAWS VOICE: Four, press DOWN ARROW four times to go to the fourth line under that heading.
OLEG: Okay, so apparently we’re trying to find some information, or get to some specific spot on the page. Now.
JAWS VOICE: Result: JAWS speech stops, and JAWS is unloaded.
OLEG: Well, probably we didn’t need to include the phrase “JAWS speech stops.” We could just say “JAWS is unloaded,” or “JAWS crashes,” or whatever. So the author of this bug report could have just said “JAWS is crashing on me on the web,” and that we would have no way to reproduce. But right here we have a specific browser, we have a specific website, and a specific point on the page where JAWS is crashing. Easy to find, and a lot easier to reproduce than some vague bug report. Now, in this sample report, there is another line that may seem to be a bit out of place. Let me show you what I mean.
JAWS VOICE: Expected result: There should be no crash.
OLEG: Well, of course there should be no crash. Here, it’s quite obvious. But sometimes, we may not know what the expected result should be. So it’s really nice if you tell us. Now, for example, you’re going through some database application, and you have some steps, and we follow the steps. And then at the end you say, “Result: JAWS is reading the field contents.” And for us, this would probably be enough, but you’re obviously looking for something else. And we’re getting this report, and we’re thinking, “Well, what would the expectation be?” And then the next line may say something like “Expected result: JAWS should announce the field name as well as the field contents.” So now we know what response was expected in this particular case, and that makes it so much easier for us to handle.
Now, we may disagree with the expectations. We may say that it can be addressed by some simple setting change. But at least now we know what the expectations are. So we answered the three questions, the what, the where, and the how. And now, on to the next step, and that may seem a bit unusual to you. So you’ve put together some basic information about the issue. Now be ready to tell us a story. I shall admit that does not apply to all situations; but sometimes it may be really, really helpful to set your bug report apart from the rest. Tell us why this specific thing is important to you. How is this preventing you from doing your job, or doing your studies, or living an active life?
You may say something like, “In college I’m taking an algebra course, and we use this specific application. For the most part, JAWS or ZoomText works pretty nicely in this application. But there is that specific dialog, or specific application feature, that I’m having problems with, and certain things are not announced in speech or not brailled, even though I assume they should be accessible. This particular issue is preventing me from being as successful in this course as I could have been.” Or you may be saying, “My favorite shopping site is this and this, and it’s generally accessible, but there is a set of buttons on that shopping site that does not speak with JAWS.”
So that gives us a personal story. That explains to us why this specific issue is important to you. And at the same time, this gives your bug report a lot more prominence. Again, that does not give any guarantee that the fix will be coming soon. However, your user story, if applicable, really gives some prominence to your bug report, making it stand out among the rest.
So you’ve answered the basic questions. You’ve put together a user story. Now it’s time to put it all together, and we’re writing our bug report. You can just create a new document in Notepad, or Microsoft Word, or whatever you like to use, and you start with your first and last name, and your product serial number. Then you include a short and concise statement of the issue. So, for example, “Whenever I go to this particular site, with this particular browser, in this particular version of JAWS, and I reach this spot on the web page, I get a JAWS crash, or I get some other unintended behavior.” So again, you present it as clearly and as succinctly and as specifically as possible. Then you put in the steps to reproduce. And, by the way, if in the process of reproducing this bug you get any error messages, it is really helpful to include those, as well.
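Just to pull it all together, here is a rough sketch of how such a finished report might look. Please keep in mind that everything below is only a placeholder, borrowed from the example we heard a few minutes ago; your own report would of course carry your real information.
Name: your first and last name. Serial number: your product serial number. Summary: JAWS crashes in Microsoft Edge on a specific web page when arrowing down below the second heading. Steps to reproduce: one, start JAWS. Two, using Microsoft Edge, go to the page in question. Three, press H twice to go to the second heading on that page. Four, press DOWN ARROW four times to go to the fourth line under that heading. Result: JAWS is unloaded. Expected result: there should be no crash. User story: in college I’m taking an algebra course, this page is part of my course materials, and the crash is keeping me from finishing my assignments.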
So again, we have the short statement, the steps to reproduce, and your user story: I go to college, I’m taking an algebra course, and this is important to me for these and these reasons. However, don’t send it yet, because there’s one more step. Remember that we never asked you to gather your complete system information yourself. Instead we have a tool called the FS Support Tool. So I’m going to bring up my JAWS window.
JAWS VOICE: JAWS Professional.
OLEG: And I press ALT+H for the Help menu.
JAWS VOICE: ALT+H, Help Menu. Command search, INSERT+SPACE, J.
OLEG: And I press F for FS Support Tool.
JAWS VOICE: F, leaving menus. User Account Control dialog. Do you want to allow this app to make changes to your device? FS Support Tool. Verified Publisher. Freedom Scientific, Incorporated. File origin. Hard drive on this computer. No button.
OLEG: So, very nice, that’s a UAC prompt. And I just say yes.
JAWS VOICE: ALT+Y. This tool will gather information required by our technical support team in order to assist you as quickly as possible. Continue button.
OLEG: Press Continue.
JAWS VOICE: Space. Please do not close this window, or the gathering process will be stopped. Cancel button.
OLEG: So this will be gathering some information. And the information will be used only and exclusively for the support request that we’re about to file. And I’m going to cut out some silence from the recording as the information is being gathered.
JAWS VOICE: Please enter your email so that our technical support team can identify your report. Email, 50 characters max edit.
OLEG: So let me write fscast@vispero.com. By the way, if you have any questions about the FSCast or anything you would like to suggest, that’s your address to use, fscast@vispero.com. I’m going to enter it here.
JAWS VOICE: Send Report button.
OLEG: I’m probably not going to send it, because otherwise people in our tech support will be wondering what in the world that is. But in a real-life situation, you’d be sending the report at this point. And that would give us the support package. And now, to submit the bug report, we go to freedomscientific.com.
JAWS VOICE: [Screen reader output]. Enter. Open new Freedom Scientific high-quality video magnifiers. Braille display.
OLEG: And on the website we’re going to support.
JAWS VOICE: [Screen reader output] technical support.
OLEG: So technical support is exactly what we need. Press ENTER.
JAWS VOICE: Enter. List with eight items. Technical support. Technical.
OLEG: And let me read something from here.
JAWS VOICE: Technical Support. Level 1. Freedom Scientific offers free technical support to its U.S. customers using any of the following methods.
OLEG: And here is how you can now send your report.
JAWS VOICE: Bullet. Link. Submit a technical support request.
OLEG: You can submit a technical support request.
JAWS VOICE: Bullet. [Screen reader output].
OLEG: Or you can call.
JAWS VOICE: Bullet. Send email to...
OLEG: Or you can send email to...
JAWS VOICE: Sendmail link support@freedomscientific.com.
OLEG: Support@freedomscientific.com. And, by the way, for some of us, emailing technical support is still the best choice. So if that’s the case, you don’t even have to go to that web page. All you have to do is to write a new email to support@freedomscientific.com. Whatever you do, you already have the document you put together in the previous step, and you have your name and email address. Please make sure to let us know that you have submitted a technical support package via FS Support Tool.
So off we go. We send that report. What happens then is it gets to our technical support. The people in technical support are really knowledgeable, and in some cases they may be able to give you steps to work around the issue. Or in some other cases, they might suggest a different solution, probably using a different browser or a different version of the same website. But they can also escalate your request to product management, and it will then go to engineering and testing so that the issue may be fixed, and the fix may be tested. And then it may roll into one of our product updates.
And by the way, many of the fixes included in our February update and April update are direct results of customer input. Here at Vispero, we make it a priority to listen to people who are using our products. We have plenty of tools to communicate. We have training videos, training podcasts. We have our technical support database. We have this FSCast. You can also find us at major conferences and trade shows worldwide. So it is our goal that your voices, the voices of product users, should be heard. Unfortunately, we cannot promise immediate fixes. But we can promise that our communication is open, and we are here to listen and respond.
OLEG: Well, this brings us to the end of FSCast issue 229 for April 2023. Please keep your emails coming to fscast@vispero.com. Glen Gordon will be with you next month. And until then, have a great spring, and thank you for listening.
JAWS VOICE: Goodbye.