Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.

You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.

Contents

Welcome Back for 2024

Episode 263

Blind is Fine

Jonathan Mosen Speaks With Ryan Jones, Vice President, Software Product Management at Vispero

Level Access Buys Userway

A New Key Is Coming to Your Windows Keyboard

The AI Microsoft Disability Answer Desk

My Favourite Podcast App

Sonos Era 300 Versus Five

Looking for a Laptop That Does Audio Production

Making the iPhone Action Button Do Double Duty

The Bonnie Bulletin, Yoto, ThinkPads, Tablets, and Terminology

Closing and contact info

Welcome Back for 2024

[music]

Voiceover: From Wellington, New Zealand, to the world, it’s Living Blindfully – living your best life with blindness or low vision. Here is your host, Jonathan Mosen.

This week: inspired by another blindness organization dropping the word blind from its name, I discuss why blind (not sorry) seems to be the hardest word, Ryan Jones joins us from Vispero to discuss the present and the future of screen reading, and Level Access buying Userway. What?

A belated happy new year to you. I hope 2024 has started well for you.

It is great to be back. I am feeling rejuvenated in every respect – physically, emotionally, all those good things. I had a great break with family and friends, but I am really pleased to be here to discuss the issues that are important to us.

If you are new to the podcast (perhaps you’ve discovered it over the new year), this is not a technology podcast. Although I guess because of my background and the interests of a lot of people who listen, we talk a lot about technology.

This podcast is, as it says in the little intro there, about living your best life with blindness or low vision. So we discuss a wide range of topics. Technology has a big impact on our lives, but so, too, do governments and blindness organizations, and organizations that we interact with. And we discuss it all here on Living Blindfully.

I think one of the things that makes this podcast special is that not only do we hear from the movers and shakers (and we ask the tough questions, hopefully the probing questions, all in a respectful way), we also hear from you, members of the blind community around the world. This is our place for discussing our issues without fear or favor, and I’m delighted to be back doing this again for you for 2024.

Thank you very much to our advertisers. And also, particularly to our Living Blindfully plus subscribers who make all this possible.

Episode 263

This is episode 263. And as we like to do for a bit of fun to get the episodes started, we look at the corresponding area code in North America, and the country code.

The North American area code 263 belongs to Montreal in Canada, where I’m sure at this time of the year, it’s a bit brrr! It’s a bit cold in Montreal in Canada, [laughs] but a beautiful part of the world.

So if you’re listening to us from Montreal, happy new year to you. Enjoy your moment in the snow. [laughs] Moment in the sun doesn’t really seem appropriate. So welcome.

And country code 263 belongs to Zimbabwe.

I miss the Zimbabwe cricket team. Not sure what’s happened to them. They seem to have fallen out of the ICC world rankings in some way.

But I do remember cricket world cups at which Zimbabwe competed. Great cricketers like Andy Flower.

What’s also interesting (one of the many interesting things) about Zimbabwe is that they have a blind cricket commentator. And I was in touch with Dean du Plessis, who is the blind cricket commentator in question, some years ago. And I can’t find him anymore.

So Dean, if you’re out there still listening to this podcast, or if anybody knows Dean, I’d dearly love to get him on the podcast.

Cricket is one of my favourite things. I spend a lot of time over the summer listening to cricket.

Although sadly, during my summer break, we only had T20 cricket and one-day international cricket. No test matches. They’re happening now that I’m back at work. That is very disappointing. But there were test matches over across the ditch in Australia, all of which Australia won pretty comfortably.

But I would love to talk to Dean about how he does what he does as a blind cricket commentator. See, our horizons are just totally unlimited.

So welcome. I know we do have listeners in Zimbabwe. I really appreciate you tuning in to Living Blindfully, and I hope you get value from the podcast.

[music]

Blind is Fine

One of the benefits of a long summer break is that I get to think about things.

Imagine this. Imagine walking into a store called, let’s say, Shoe World. You walk into the store Shoe World expecting to buy a pair of shoes, funnily enough, only to find they don’t sell shoes at all. They actually sell hats instead.

Most likely, you would say, “Well, that’s a highly confusing name for a hat store. Why would you call it something like that?”

Moving on, wearing the hat you bought from Shoe World, you decide to do some volunteering to make the world a better place, and you find an organization called the National Organization for Women, only to find when you try to enrol (because you are a woman), that it’s a men’s only organization.

You’d think, “What a topsy-turvy world we live in. Why don’t they have a better name?”

Sadly, you’re dealing with a death in the family, so you have to contact the funeral director. You find one, and the company is called Transport for Life.

You think, “Well, that’s a bit nebulous.” Perhaps even a bit insensitive.

And many of us, of course, know about Monty Python’s famous cheese shop that has no cheese at all.

Well, you may consider those examples ridiculous, and they are. Yet increasingly, there is a war on the word “blind”, coming from the increasingly sighted leadership of organizations that are supposed to provide us with services.

In the minds of some, the word “blind” has fallen out of favour. Some see it as a pejorative, in the way that certain words relating to disability, like crippled and retarded, have, thankfully, fallen out of favour.

For some time around the English-speaking world, it has been customary to call teachers who provide Braille instruction and other assistance to blind and low vision children vision teachers. What signal is that sending to young blind kids?

In Australia, the major provider of blindness services is Vision Australia. Nothing is universally disliked, of course, but I’ve talked to a fair few blind Australians who agree it’s a horrible name, but they shrug their shoulders and they say, “Well, that horse has bolted now.”

I was reading an excellent article from the US over my summer break that was released to celebrate World Braille Day. It was published to promote the Cleveland Sight Center.

So here’s an article about Braille (a very good one) the means of literacy for the blind, and it’s being published by an organization that calls itself a sight centre.

In the United States, lighthouses have been around for years, and I’m not talking about the facilities that assist people at sea.

In November, the National Council for the Blind of Ireland changed its name to, you guessed it, Vision Ireland. They still do what they’ve always done. It’s just blind people aren’t worthy of a mention in their name anymore.

When I worked in assistive technology, I was required to attend exhibitions called Sight City and Sight Village, exhibitions that were full of screen readers and Braille displays. Sure, there was plenty of magnification equipment, too, but that equipment existed because of a vision impairment.

I’ve commented several times on this podcast before about Sight Tech Global – a conference on technology that will help blind people and those with low vision. I’ve declined to participate in it because I don’t think it’s acceptable that a sighted person should come along and seek to erase how many of us choose to identify ourselves.

Another term we hear a lot these days is people with sight loss. When you use this term, you are not describing me or people like me who were born blind. I have never lost any sight because I didn’t have sight in the first place.

Frequently, I get emails where people invite me to speak on a podcast or at an event, and they talk about it being for people with sight loss. I tell them that if that’s what they want, “I’m not your guy because I’m blind. I have been blind since birth. When you talk about sight loss, you’re not talking about me.”

Make no mistake. There is a linguistic war on the word blind, and it requires an assertive response.

Even if you’re not proud to be blind as I am, even if you don’t feel a sense of the common history, a sense of the culture and the community as I do, I hope we might at least find common ground on the fact that blindness is nothing to be ashamed of. Blindness is not a dirty word. Blindness is not a derogatory word, and there is no need for organizations providing services to us to erase the word “blind”. When they do, they’re sending a signal that those of us who choose to identify this way don’t matter.

To be clear, I don’t think all organizations serving blind and low vision clients or customers need to have the word blind in their name. But if organizations want a fancy new name, why replace ‘blind’ with something that means the complete opposite, vision or sight?

One of the reasons offered by some of these organizations is that when the word “blind” is used exclusively in an organization’s name, it alienates those with low vision who are eligible to receive services and could be assisted greatly. Such potential beneficiaries conclude that since they’re not blind, and indeed may feel some strong resistance to using the word “blind” to describe the state of their level of sight at present, the organization isn’t for them. This is a genuine and real concern. It’s a different situation for consumer organizations who are more likely to promote a more positive message around blindness.

But when you’re a service provider, you need to meet people where they are on their vision loss journey. And yes. In this case, it is usually a vision loss journey. But I don’t think that the answer is to make one group of people feel more included by linguistically ostracising another.

In this regard, (and I say this as someone who had nothing at all to do with the decision), I do think New Zealand has got the balance right. The primary service provider in this country used to be known as the Blind Foundation and various other variations on that theme, but changed its name a few years ago to Blind Low Vision New Zealand. I think that works, and it’s fair.

If the organization had become Vision New Zealand, (and I feel sure there was a school of thought that that’s what should happen, although I think, mercifully, there may have been a political party that beat them to that name), I would have de-registered. I’m blind. Calling something Vision anything when providing services to blind people is exclusionary.

Similarly, the provider of services to blind and low vision children in New Zealand is called Blind and Low Vision Education Network NZ, or BLENNZ for short. Good stuff. Everyone feels included.

For some of us, the word “blind” is part of our identity. Our identity is worth speaking up for.

To anyone in agencies considering joining this destructive cause, I urge you not to alienate one group of people when including another. It is absolutely possible to include everyone who needs to be.

Many of us feel like we can’t make a difference. All of us absolutely can.

Hopefully, it’s not too late for New Year’s resolutions, because these things are pendulums. Get involved. Get on the boards. Do what you can to be in a position of influence, and we’ll get the pendulum swinging back.

The bottom line? Blind is fine.

Ryan Jones from Vispero is next on Living Blindfully.

Advertisement: I want to remind you that transcripts of the podcast are brought to you by Pneuma Solutions. We really appreciate them doing this because not only does it mean that the podcast is fully accessible to our deaf-blind community (and we enjoy their input very much), but, as is often the case with many accessibility benefits, others enjoy the transcripts as well. So thank you, Pneuma Solutions.

In collaboration with Blind Information Technology Specialists (BITS for short), an affiliate of the American Council of the Blind, Pneuma Solutions are currently offering a RIM special for BITS members.

Anyone who is an active BITS member will get an additional 30 minutes a day on top of the existing 30 minutes that everybody gets for RIM. After my summer break, I can just about do that maths and work out that that means that BITS members get 60 minutes of free RIM a day.

You don’t even have to have a subscription. This all kicks off on the 1st of February, so you’ve got a bit of time to join BITS if you would like to, and take advantage of the membership.

If you’d like to know more and to check out RIM in general, go to PneumaSolutions.com. That’s P-N-E-U-M-A Solutions.com.

[music]

Voiceover: Since you’re listening to this podcast, you already know that Living Blindfully has a substantial, engaged global audience. We’re heard in over 110 countries and territories.

That’s an opportunity for you if you have a product, service, or podcast you’d like to tell our audience about.

Get in touch with us about advertising here on Living Blindfully. We’ll tailor an advertising campaign to suit your message and your budget.

Find out more and get in touch by visiting LivingBlindfully.com/advertise. That’s LivingBlindfully.com/advertise, and share your message with the Living Blindfully community.

Jonathan Mosen Speaks With Ryan Jones, Vice President, Software Product Management at Vispero

The only thing that’s constant in the world of technology is change.

Companies and products come and go, yet JAWS has endured. It’s been synonymous with screen reading since the late 1980s, and we’re fast approaching the 30th anniversary of JAWS for Windows. That’s a significant tribute to all who’ve kept the product relevant and thriving.

Freedom Scientific has its own media, including podcasts outlining new features and providing training.

So as Ryan Jones who is Vice President of Software Product Management at Vispero joins me, I want to go beyond the information that’s already available, and talk about some deeper strategic issues relating to screen reading in the present and the future.

Jonathan: Ryan, I really appreciate you coming on Living Blindfully. Thank you so much. It’s good to connect with you again.

Ryan: Thank you, Jonathan. Likewise. It’s been a while since we’ve had a good chat, so I’m glad to get together.

Jonathan: So you taking on this role is, in itself, a significant change, with Eric Damery having held it for so long.

I suspect that you will always look back with particular fondness on JAWS 2024 because it’s the first one that you oversaw. Are you pleased with its reception?

Ryan: Yeah, I am, and it’s been a really good journey for me. And you’re right, the 2024 release was the first one that was under our team.

But the good thing is our team, they know how to do things. They just move along.

So Eric retiring and me stepping in, we made some changes here and there. But by and large, the team functions very autonomously.

But yes, I’m really happy with 2024. It’s been very well received.

I think we had a good balance of some new things, some work under the hood that we’ve been doing, especially in ZoomText, so I’ve been very happy with it.

Jonathan: Yeah, you got some good bread and butter, meat and potatoes screen reading in 2024.

The Split Braille feature is fantastic.

And obviously, the Face in View is great. It would have been good to have it during the height of lockdowns. But it’s here now, and it’s still very useful.

I’ve not personally experienced this, but I have seen some people on social media say that they’re experiencing some serious focus issues on the web relating to the Virtual Cursor. Is that something that you’re aware of?

Ryan: There’s always a number of things going on with the web. Specifically, I know we’ve had some challenges with tables lately that the team has been working on. I think the December update had some of those.

But it seems like a constant battle. I mean, we could see even with Chrome updating itself every couple of weeks, sometimes that’s thrown minor monkey wrenches into things, and we have to change something. So that’s partly why we have a pretty aggressive update cycle to put an update out about every 8 weeks or so.

But I always tell people, “Whatever specific problems you’re having, let us know. Let our support team know. Because that’s our conduit to hear from people if we’re not watching social media all the time, for example.”

Jonathan: Sure.

People get a bit cynical. I mean, what happens when people file a tech support request? People get worried that it will go in the bit bucket, and that nothing really happens.

Are you able to track through the anatomy of a tech support request?

Ryan: Yeah. So when someone files it, we have a case that gets entered in our backend system. We use Salesforce for that. And then, our agents, who are pretty much all JAWS users themselves, are watching for things and looking for trends.

I get a weekly report from our director of tech support, and he’s always kind of watching out. Are there trends that are popping up? Are we hearing the same thing from multiple people? And so that’s my key question to them. What trends are we hearing?

But when things come into us, if it’s not something that we can identify an exact reason for happening, or if we think it’s a bug and it needs more investigation, then it gets tagged, and our engineering team takes a look at it. So our tech support and engineering teams have a good system set up for how they tag and code things, so that the engineers know when to go look and explore, and then try to reproduce the issue.

And our product owners are looking at that and saying, “Okay. This is something that’s been found. We’ve definitely been able to reproduce it.”

And now, we need to get it on our backlog to get it looked at and fixed.

Jonathan: And external relationships at your level, at the level of some of the senior engineers, must be critical because I’m sure that sometimes, you just have to reach out to a developer of a particular app and say, “Hey, have you changed anything here, because it’s not performing with screen readers the way it used to?”

Ryan: Yeah. We especially do that with the larger tech companies like Google and Microsoft.

And I’m sure a lot of people remember back in late May, early June of 2023 when an issue came out through Chrome and Edge that was breaking especially some older versions of JAWS and the Virtual Cursor. And that was a great example of … We have a really strong relationship with the folks at Microsoft and Google, and literally to the point where we can send them an email or almost call them, and they’ll respond within a matter of minutes or hours and we can start a dialogue with them.

So there’s a lot of history there. There’s a lot of people in this industry that many of our team have known from various roles that are now at some of the big tech companies. So there’s always this kind of personal history and relationship, that we’re all in this for the same reason – trying to make life easier for those of us who use screen readers.

Jonathan: JAWS has always had competition.

I can remember some pretty edgy marketing for its time, when Henter-Joyce, which is the company that started JAWS, did a comparison of JAWS and Vocal-Eyes, which was the DOS screen reader from GW Micro.

Today, though, you’re surrounded by 2 principal competitors that are free to the user. You’ve got NVDA which is open source, and of course, Narrator which is built right into the operating system.

With those 2 free options available, what’s the value proposition for JAWS in 2024?

Ryan: Well, there’s a couple of things.

One, we try to go above and beyond and really focus on productivity. And NVDA does this too, so I’m not taking anything away. Especially if I think of Narrator, Narrator is pretty good at getting the job done – reading things on the screen.

But where JAWS really excels, even compared with NVDA, is the extra things – the things that try to get valuable information to you when you need it. Things like the notification history. For those of us who are working or getting information all the time, with Windows notifications and app notifications coming in, being able to filter those out is a good productivity enhancement.

Same thing for the Face In View feature. That’s taking screen reading to a different level. Not just reading the screen, but trying to come up with a tool to help us be better at what we do on the computer.

And that’s really one of the areas that our team focuses on. We try to think about things: what are the core screen reading things that we have to do, the bread and butter piece? But we’re always looking at what’s going to make things better for us, and then for users to access it. So I think that’s really a huge area that differentiates us.

And then, the other is just the community – the support that we have, the training and outreach that our team does. I mean, they’re pushing out podcasts. They’re pushing out webinars. They’re doing different community events online. Things are going on every week. And that’s been a core philosophy for us for many, many years: getting that training out there, and trying to get it into the hands of people, to help people learn how to use it, whether you’re a new user, or somebody who’s been using it for a long time and needs to learn about a new feature, or a teacher or another trainer, or a family member, for example. So those are, I think, a couple of the key areas that really set us apart.

Jonathan: Because the game has changed so much, hasn’t it?

In earlier times, the good/bad old days, [laughs] the secret sauce was the off-screen model. And in fact, the off-screen model that was developed by Glenn Gordon and others was so good, Microsoft actually bought that original Henter-Joyce off-screen model for Windows. And that was the kind of hackery that had to go on then so that a blind person could be presented with information about what was on the screen.

Now, Microsoft and other providers are just exposing that information. So the barrier to entry for competitors is much much lower now.

Ryan: Yeah, I think it is, and it’s changed the world. Like I said, the way we think about things. And as you mentioned, the focus back then was just how do we even know where we are on the screen? All the nomenclature was being developed. What do we call this thing, or that thing, and coming up with those standards.

And so now, in my opinion, we get to focus on things that are a lot more fun, things that are just helping take us all to the next level of being productive and using the computer. Personally, I think I’m glad to be a part of this space now, rather than many years ago, for that reason.

Jonathan: I’ve got some questions about each of those competitors, and I’ll start with NVDA.

NVDA has acquired quite a loyal and enthusiastic user base. And I’ve noticed a pattern where new features, which I’m sure your engineers and the product management team would have spent many hours defining, refining, and testing, are largely duplicated in short order by a free NVDA add-on. And of course, unless Vispero has patented something, there’s nothing legally wrong with that at all. I know some years ago, Freedom Scientific did file some software patents.

Does it frustrate you to see this pattern, in the sense that you are essentially donating intellectual property to the wider industry?

Ryan: I think as we continue to move forward in screen reading, as we continue to look for what’s going to be new, yes, there are things that NVDA does that JAWS has done, and I think that will always continue to be the case. And I think it’s actually a positive sign because it also tells us that we’re on the right track, and that we’re meeting the needs of people. So I kind of look at it as a positive for that.

Obviously, Freedom Scientific, Vispero is a business, right? We have a business model for JAWS and for the financial aspect of it. But at the end of the day, that business model is around making blind and low vision people productive in what they do.

So to that end, we’re not out there trolling and looking for patent infringement. We actually file a ton of patents, but we’re not looking for patent infringements, or to shut anyone down.

But as I said, we’re looking at it from the perspective of we’re doing something right, if other people are also duplicating it. And really, at the end of the day, our goal is to make people more empowered to use the computer, and NVDA plays a part of that. Narrator plays a part of that. Other screen readers play a part of that. So yes, we have to make money, but there’s a reason that we’re all doing this.

Jonathan: When I got my first iPhone, one of the things that wowed me was that there was one easy-to-use repository for third-party apps. If I needed an app for my phone, I knew exactly where to get it.

And that was in stark contrast with the Nokia smartphone that my iPhone replaced, where I’d had to go and search online for the website of a product that I was looking for, and then manually download the app.

I looked at the NVDA Add-on Store the other day, and I had this epiphany. [laughs] In terms of seamless third-party support, NVDA is Apple. JAWS is Nokia.

I’ve been using JAWS since the DOS days, and there’s always been a thriving community of script developers adding real value to JAWS and therefore, to my productivity. Now, I think I have enough fingers to count those who are truly active in the same way.

Why is it so easy for me to browse and add enhancements to NVDA, while JAWS doesn’t seem to want to promote one of the features that was once its biggest competitive advantage?

Ryan: Yeah. It’s actually a really good question, and one that I’ve asked myself when I took this role over. And the answer that I came to is, … Why didn’t we do this earlier? I really can’t answer that piece.

But do we want to open this up for more collaboration, and to have content contributors? Not exactly that, but to have more of the community making enhancements and doing things?

The answer is yes, and myself and our product leadership team are all in agreement that that is the direction that we need to go. The question will be exactly how and when we get there, but there’s no question.

And NVDA, I think, was just ahead of that curve. And the open source world sort of helps promote that, I believe.

And with JAWS, being steeped in history can sometimes be a challenge. It’s harder to change things, and it’s harder to get out of a philosophy of doing one thing or another.

So we all agree that it’s a good way to go, and we intend to get there, and we’ll just kind of have to see how that plays out over the coming months and years.

Jonathan: Right. ’Cause it’s new on NVDA, relatively speaking, but the model itself is not that new. In 2008, Apple came out with the App Store. And actually, I think GW Micro had an App Store-like experience for Window-Eyes quite some time ago, so they were pioneers there.

Ryan: I think they did, yes.

Jonathan: There seems to be a buzz about NVDA, sort of a collective sense of contributing to something to make the world a better place. Do you think that sense of community is something that a commercial entity (and you mentioned the commercial imperatives earlier) just can’t pull off in the same way that an open source product can?

Ryan: I think we should be able to. Whether we’re gonna create it exactly as NVDA or any other, I don’t know ’cause we all have our different cultures in a company. And open source, you’re gonna have a little different culture. But the whole idea of community and people contributing to something, that should not be exclusive to something that’s open source.

I mean, I’ve thought of a number of ways that we wanna get our community more involved, and we did the event back in the fall here in the Northern Hemisphere. You covered it – The Next Big Thing event. And that was an event that was a culture shift. It wasn’t adding some new capabilities into the software per se, but the goal was trying to work in the direction of bringing in the community and building some engagement, and that has been one of my objectives since I started.

There’s a long way to go. We’re nowhere near there. It’s a culture change for us as a company to move in that direction, but I absolutely think it’s the right way to go.

And being someone who has used screen readers my entire life pretty much, I see the value in it from the personal connection as well.

Jonathan: What was the big thing that was ultimately chosen?

Ryan: So we had 3 finalists, and the one that won was a young lady here in the US who wanted to see us use AI technology for our image descriptions.

It was actually really close. So we had several judges that participated with us. We had the audience voting. And mathematically, it was the closest score that it could have been between the first place, second place, and third place winners. So it was a really fun event.

I think that there was a clear theme in the submissions that we saw, which was certainly around leveraging AI and large language model technology, and those are things that we’re absolutely looking at and working on right now. And you’ll be hearing, literally within weeks (not months or years), about some of the things that we’re doing in that space.

Jonathan: So there’s going to be a challenge in some respects, I imagine, because I’m sure that there would be an easy way to put a series of JAWS scripts together where somebody could potentially add their own ChatGPT API key, and then they would be paying for that. Essentially, they just top up the number of calls and they’re good to go.

But if you were to build it in, in the same way that Picture Smart is built in, that could actually be quite costly for the company.

Ryan: Yeah.

The theme for me, … because I saw some of the stuff that was done in NVDA, which was really clever. But it does require you to … you’ve got to do some manual steps. You’ve got to have an OpenAI account. You’ve got to be able to go on there, top up your account. You’re now drawing down from it every time you use it. So you might have this kind of fear of, “I don’t want to use it because I don’t want to spend my money.”

In JAWS, we’ve not tended to have any user features like that be pay-to-play, so to speak, where you have to add on your own account like that and pay for it.

So my goal with how we’re going to work with large language models right now and other AI technology is it’s just going to be part of the product.

And that is one reason I think that there’s kind of the philosophy of an App Store within NVDA.

One of the things that we have tried to do is build as much into JAWS, right or wrong, right? You could argue the other way. But we try to build as much into it, so that people can access it easily to try to reduce, break down the barrier of entry for people.

For AI, I don’t want this to be something that people have to go out and figure out how to fund it themselves. So that’s the bridge that we’re trying to cross. The technology piece is fairly straightforward, but it’s kind of getting this model set up where we can afford to do it, but our users can also afford it as well. And we don’t want to nickel and dime people with it.

Jonathan: I don’t know whether you’ve conceptualized a JAWS App Store enough to answer this question or not but obviously, there are some commercial script developers out there. Would you envisage some sort of revenue-sharing model like Apple’s, where somebody could seamlessly, very easily purchase a third-party script package and Vispero would take a cut of that revenue?

Ryan: That’s one of the reasons we really haven’t done it, to be honest, because it would be a lot easier if it was all free. But we certainly never want to undermine what other people are doing, and the revenue that they’re generating with writing scripts and enhancements to JAWS. So that has to be a part of it, no question.

Jonathan: I use a collection of third-party scripts, and it seems that there has been an increase in compatibility issues with them since JAWS 2024 came along.

Does that further illustrate the point that more might be done to build a thriving ecosystem of third-party developers around JAWS, where there may be mutual benefit and closer cooperation between Vispero and those developers?

Ryan: Yeah, I think so. We have a good relationship with a number of third-party developers. They don’t work for us, obviously, and we can’t necessarily test everything that they’re working on. A lot of them are a part of our private beta team, so they’re able to test early on when we’re making updates to JAWS, for example, and they give us feedback about issues that they’re finding, and we try to prioritize those as best we can. But yeah, certainly more of a community approach would probably smooth out some of those bumpy edges.

Jonathan: Looking at Narrator, obviously, there’s the convenience factor there of just having it right there in Windows with nothing to install. And some time back, they actually changed their command set to be quite JAWS-like in many respects. [laughs]

There’ve been rumors that Microsoft has been working on a scripting language for Narrator. If that happened, would that pose an existential threat to JAWS?

Ryan: I don’t know that it would pose an existential threat. I mean, it would be certainly something. Anything that another product does forces us to take a look at it, to step up our game and make sure that we’re providing extra value to people. So a scripting language in itself, … I don’t know if that’s gonna change the whole, pardon the pun, but the whole narrative for Narrator, but it certainly would be something we would pay attention to.

And even without it, like I said, we’re always looking to make sure that we’re providing above and beyond what the other options are because we have to provide something of value for people where money’s changing hands for our screen reader.

Jonathan: What is your sense of Microsoft’s attitude towards third-party screen readers? Do you think they want them to thrive or long-term, would they rather that Narrator be the only game in town?

Ryan: I can’t speak for Microsoft, but I can say that we have a really good relationship with Microsoft. We get a lot of support from them. They ask us a lot of questions. We have philosophical discussions around screen reading that are mutually beneficial for all of us whether it’s Narrator, whether it’s JAWS, whether it’s NVDA.

They have a very active partnership program that we’re a member of, other assistive technology software manufacturers are a member of. So there’s a very good open community and dialog around third-party screen readers and other AT software that Microsoft cultivates. And so, you know, I can’t speak for them but I would say that they definitely like to have that open environment to have discussions in.

Jonathan: I want to come back to something you mentioned when we were talking about the value prop for JAWS in 2024 because it’s kind of a pet thing of mine.

One of the things that I’ve always appreciated about JAWS is its respect for productivity. Literally every syllable of superfluous speech slows us down, and that unnecessary verbiage, it all adds up.

Microsoft’s now much more aware of accessibility than it’s ever been, but my sense is that there’s no corporate understanding of the productivity drain that excess verbosity can create. And as you mentioned, JAWS has provided tools in recent times to give users control of some of this verbiage, like the excellent control over notifications.

Does Vispero ever have any ability to influence the text that Microsoft is sending to screen readers? Do you get consulted about what blind people might want?

Ryan: I would say that on occasion, we do.

I mean, you can imagine how many products there are in Microsoft and how many people are working on those. And even from an accessibility standpoint, it’s, yeah, I have no idea what the number is, but it’d be enormous.

But the product groups that we do have good relationship with, we do have that dialog, and they’re completely open for our feedback sometimes which is, “Hey. I’m glad you’re doing some of these accessibility techniques on these controls, but tone it down. It’s way too much information.” And so we’ve had really positive dialogs with them.

And sometimes, they’ll say, “Yeah. We didn’t even realize it was doing that.”

So I would say we have a small amount of influence. But it’s almost like raining in the ocean in a way that there’s so much out there that Microsoft does, and other tech companies as well, that the influence that we have is probably small in the grand scheme of things.

Jonathan: So this is the danger that I see – that in the past, a lot of scripting customization was necessary on JAWS’s part. But now Microsoft is handing it to Vispero, and I get concerned that the experience that one has as an end user doesn’t vary that much anymore between screen readers because Microsoft is essentially packaging it up and handing it to you on a silver platter.

When do you decide that you’re going to invest some important engineering resources in overriding some of this horrible verbiage that we have to endure?

Ryan: We tend to focus on probably the larger Office applications, for example. I mean, there’s so many things. We have to narrow down where we focus on. So we tend to really focus on the browsers, the primary Office suite – the Word, Excel, Outlook, PowerPoint, Teams obviously is a big one.

So where we do testing in those applications and where we see there’s things that are slowing down productivity, usually, we try to approach that topic with the manufacturer, whether it’s a group at Teams, or Microsoft, or whomever.

But we’ll also look at – is there something we can do, and is there any bad ramifications to doing that? Because sometimes, we may want to filter out something that the application is saying, but that may undermine something else that we didn’t know about.

So we have to kind of take it a little bit carefully. And really, the philosophy that was set was rather than try to take all of it out ourselves and decide who gets what, let’s provide some tools to let people do that themselves and decide.

Jonathan: Yeah. That’s the dilemma, isn’t it? Because one person’s excess verbiage is another’s essential information.

I guess, an example of this is Microsoft Edge. I would not use Microsoft Edge. I simply would not use it if JAWS hadn’t given me the ability to get rid of “loading page”, “load complete”, and all the other nonsense that JAWS says.

But some people like that. So it’s difficult to know what to do in that situation, I’m sure.

Ryan: Yeah. I’m the same way on that. That was one of the first notifications that I created in Notification Manager was to do that.

[laughter]

Ryan: But I also have met other people, like you said, who, they love it. They love hearing that the page is loading because it’s their way of seeing the old spinner that’s there on the screen that tells someone when it’s loading. So you’re exactly right. That’s the challenge that we have.

Jonathan: And I say to them, “Dude! If the page wasn’t loading, it would be speaking, right?”

Ryan: [laughs]

Jonathan: I mean, if the page wasn’t loading, it would be talking to you. [laughs]

Anyway, there is no getting away from the fact that to employ developers, and tech support people, and all that goes along with creating a sustainable product, it costs money. That’s the commercial model that you have. But that has put JAWS out of reach of some whose lives could be transformed by it.

One of the welcome changes in recent years has been this introduction of the annual license, which means that people can subscribe essentially to JAWS for the price of that annual license, and then they just renew it every year.

One of the questions I got asked to ask you by several people around the world is, why is that option not available outside North America?

Ryan: It’s a great question. It’s one that I actually asked myself a while back, and I have to ask myself that all the time.

I would say there’s no reason that it can’t be outside of the US.

The distribution model for JAWS has been through dealers. And so we have a network of dealers, and those could be small, medium-sized companies who are distributing JAWS.

A lot of times, they’re localizing. So they’re doing all the translation for JAWS or ZoomText into whatever language.

There’s over 40 languages right now that JAWS is localized in. A number of those are done by the dealers who are then reselling JAWS, so that they can have a funding revenue model. And they’re oftentimes providing support.

So all of our JAWS that’s sold outside of the US is going through that dealer network. We don’t actually sell directly to anyone outside of the United States.

So one of the challenges that we have to overcome (and I say challenge, not a barrier because it is just a challenge) is we have to work through that model. How do we do this the right way to set up an annual type of subscription in another geographic area that will not only be financially attainable for those individuals using it, but also not undermine the dealer network that we have and the value that they provide to people of those different countries? And so it’s not a barrier, it’s a challenge, and it’s one that I am looking into, and there’s no way we’re going to be able to do it everywhere and all at one time.

The good thing is a lot of the dealers outside of the US, they do have lower cost options for people that are paying out of pocket. But it’s not necessarily everywhere, and not for everyone. So I would love to see us have a subscription-based system like that that we can deploy in other countries, and I think it will happen. I believe it will happen. It’ll probably happen one geographic area, one country at a time, as we work to try to make sure we’re meeting the needs.

It’s not as simple as just throwing up a website, or localizing or translating the portal into another language. There’s a ton of other things that have to go into making that happen.

Jonathan: And I get the localization argument. But there are a number of English-speaking countries where this option is also not available.

And it could be, I guess, that the challenge is that distributors do add considerable value in certain use cases. For example, if a distributor needs to go and install JAWS in a workplace where security might be quite tight, or people need particularly intensive tech support because they’re not particularly savvy, they’re not familiar with computers yet, that is real value that’s being added.

But we have many listeners, I’m sure, from around the world in English-speaking countries who would just like to be able to go to the Vispero website, download it, purchase the annual license, and if they have the occasional tech support issue, they’ll call or they’ll email.

And I guess this is just one example, it seems to me, of where JAWS’s longevity is a disadvantage. Because all of this dealer network that is around, where there seem to be no exceptions offered, predates the fact that you can now go online and download something, and transact business online.

Oh, I remember the original set of JAWS floppy disks, right?

Ryan: [laughs] Exactly.

Jonathan: But can we get the dealer network out of the floppy disk era and give English-speaking countries this license?

Ryan: Yeah. You took the words out of my mouth.

It’s a challenge from a system that’s been in place for a long time, and it needs to change. And it has to be changed thoughtfully, like I said, because people’s lives are depending on these companies that are distributors, their funding scheme. They’re oftentimes family-run businesses.

So we have no interest in undermining what they’re doing, but your point is exactly correct – that there’s a way we can still do this in the modern world. I mean, this isn’t an unsolvable problem. It’s a problem that other businesses have solved, and we just have to solve it. And it’s something that I’m committed to that we will solve. We’re just not really fast sometimes at doing things, and that’s just the honest truth.

Jonathan: Yeah, because economies change, circumstances change. And businesses that were viable 30 years ago may not be viable anymore. And that’s just a fact of progress, right?

I mean, I think that the most important thing is that the customer is put first, and that you can get this product in the hands of as many people as possible.

Ryan: Yeah, absolutely.

Jonathan: We’ve been talking recently on Living Blindfully about the digital poverty experienced by blind people in developing countries. Obviously, we’ve talked about the free solutions that are available, but neither of them can do all that JAWS can do.

Does Vispero give this any thought? I realize it must be quite a dilemma, given the commercial business model that you have.

Ryan: I’ll disclose one thing. I’m not as familiar with some of the commercial side, especially when we get out of the US, as some of our other teams. So I don’t wanna say anything that’s incorrect.

But I do know that some of our distributors outside of the US have a different pricing model, and we allow them to use a different pricing model. So we provide them a different pricing model, so that they can provide some lower cost options. There’s plenty more to be done.

I think the bottom line is this kind of ties back to what we just talked about, with having a low-cost subscription type of model for people. I mean, the traditional way of a perpetual license and paying a higher fee up front for that, that obviously feeds right into a digital poverty divide. Whereas a low-cost subscription pay-as-you-go, pay-when-you-need-it type of model helps break that barrier down.

So I think we can move in that direction by doing what I talked about earlier. It’s just being able to provide more of a subscription-based model to our customers.

Jonathan: Yeah.

One of the things that really impressed me was the country licenses that were negotiated sometime back. Colombia and Hungary were the only ones that I’m aware of that were negotiated. There may possibly be more. That seems like a really innovative approach, where blind people who cannot afford JAWS, even if it was really cheap, can have it as a right because the government has paid for it.

Ryan: Yeah, and we’ve seen that be successful. I mean, there’s obviously costs to the government for that, but they’re also seeing more productivity out of people, and so they’re seeing the value in that.

And we’ve been in discussions with other countries as well, and I think you’ll see more of that. Those are often long projects to come from idea to fruition, as you can imagine working with the government.

But even in the US, we have this with certain government agencies, and we have some other government agencies that we’re in discussion with right now on distribution and providing JAWS, whether it be to employees or constituents. And so there’s certainly really good value in that, and people are recognizing that that investment will pay off for them later on.

Jonathan: Yeah, you’re right.

It’s a strong economic argument because if people invest in the capacity of blind people, then they’ll go out to work, and they’ll pay taxes, and that country license will have paid itself back in no time.

Ryan: Yeah. And it’s even creating jobs. I mean, even just to support some of those country licenses, there are companies hiring people to do technical support in those languages. So there are these whole small-scale economies that flow off and around that, in addition to just the potential of employment now for blind people in those countries.

Jonathan: Having done jobs similar to the one that you have now, I know you’ll have many people bending your ear about what should be in the product, and I’m sure that new shiny things are music to the ears of the marketing team. But that does seem to mean that features that were great as a first cut don’t get any more love.

Research It, for example, had a following, and APIs for a raft of services that might be used in a beefed-up Research It package still exist. And we know that because Leasey from Hartgen Consultancy seems to be filling the void and doing quite well. In the case of Research It, it’s been abandoned altogether rather than updated.

Flexible Web is an incredibly powerful feature, but it hasn’t been given anything new since its initial release.

I’m sure other people have their favourite features where they’d like to see refinements, and I think refinements are a different category from bug fixes which do come along very regularly now, and they’re welcome.

So how do you (this is, I guess, the perennial question for any product manager), how do you strike that balance between new features and improving existing ones?

Ryan: Yeah. If anybody figures the exact right answer, let me know. [laughs]

Jonathan: Yeah.

Ryan: That’s the question our team asks every day.

The main driving factors, … One thing is we make a lot of use of telemetry in the software, and some people have heard us talk about this some versions ago. We added telemetry, which is basically anonymous usage data. So if you’re opted into telemetry as you’re using different features, it’s telling us what features people are using.

There’s nothing unique. We can’t tell who someone is, or get any personally identifiable information out of it.

But we can see, “This is how many people used skim reading in the last 90 days.”, or “This is how many times Flexible Web was used.”, or “This is how many times JAWS Tandem was used.”, for example. That data is really valuable for us because that’s one way we can help prioritize things – by seeing what people are using.

Whereas in the past, you would have to just kind of intuitively know, or ask a bunch of people, “Hey. Do you use Flexible Web still?” “Do you use Skim Reading?” So we have some more analytical data that we use to help prioritize.
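[Editorial note: as a sketch of what “anonymous usage data” can mean in practice (illustrative only, not Vispero’s implementation), each event might store just a feature name and a date, so the reports Ryan quotes are simple aggregations with nothing identifying in them:]

```python
from collections import Counter
from datetime import date, timedelta

class Telemetry:
    """Toy anonymous usage log: counts keyed by (feature, day), no user identity."""

    def __init__(self):
        self.counts = Counter()

    def record(self, feature: str, day: date) -> None:
        # Note what was used and when, never who used it.
        self.counts[(feature, day)] += 1

    def uses_in_last(self, feature: str, days: int, today: date) -> int:
        """Answer questions like 'how many uses of Flexible Web in the last 90 days'."""
        cutoff = today - timedelta(days=days)
        return sum(n for (name, day), n in self.counts.items()
                   if name == feature and cutoff < day <= today)
```

Any real system would also batch and transmit these counts, and only for users who opted in, but the privacy property is the same: the stored key carries no personally identifiable information.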

And then also, there’s still kind of this art to it, I think, that we all have to do. It is just kind of that art of the feel of what’s the right balance.

We’ve got a list of shiny features, a mile long right now, that we think would be really great. We’ll never get to them all. If we had 100 developers, it would take us years to get to them because we have to balance ourselves with the core functionality of JAWS.

And then, all the fixing for every new feature, as you alluded to. You’ve got to fix stuff with it. Something’s going to break. Something’s going to change in a browser or in the application. Something’s going to change on the websites, as was the case with Research It breaking all the time, because the underlying websites that Research It was relying on kept changing.

So you keep adding new things, and you keep having to maintain them. And then, all you’re doing is maintaining things.

And so, that’s a huge challenge for us. I think sometimes we get it right. And sometimes, we don’t. And we try to be nimble enough if we don’t get it right. And if we’re hearing a lot of pushback about something we’re neglecting, then we do try to take that seriously and change our course on that.

Jonathan: Yeah, because that telemetry data is subject to interpretation, I would imagine. Are they not using a feature because no one wants the feature, or are they not using the feature because it’s inadequate?

Ryan: Exactly. Yeah, yeah, that’s right.

Jonathan: You mentioned in a comment for Living Blindfully recently that remote access is going to be a focus for JAWS in 2024. But you were quite noncommittal, I have to say, about RIM, specifically, which has become a valued tool for many people. Do you intend to compete with RIM, or work with RIM?

Ryan: You know, I mean, you could say, “Does JAWS Tandem compete with RIM?” I don’t think that I’ve ever looked at it as competing with RIM. Our remote access feature in JAWS Tandem is a functionality of JAWS that we think is very valuable.

We’re not in the business itself of remote access capabilities. That’s kind of what RIM’s bread and butter is. So we’re not looking to take that away from RIM. What we’re looking to do is make sure that the feature that we have that many people have used is as good as it can get.

And then where we need to partner with RIM to make sure that our users are getting a good experience when using RIM, then we need to do that, and I’ve said that we will be doing that.

And that is the case. We will be working with RIM to see what’s the right approach.

I think we have some thoughts and ideas, but we have to have dialog with RIM, and then see where does this take us. What’s the right approach?

But there’s no animosity there. There’s nothing. There’s no subterfuge, basically. It’s just a matter of getting the work done, and us working on the things that are core to JAWS.

And we do have intentions of bringing Tandem forward. Tandem is another one of those features that was very novel back in its day.

But it’s getting older now. It’s getting dated, and so it’s going to take effort to help bring it forward.

But I still think there’s clearly a place for what RIM is doing. There’s clearly a place for what JAWS is going to do, and I think they’ll fit together.

Jonathan: One of the things I really do appreciate is that I can log in using RIM to a sighted family member’s computer, which doesn’t have any screen reader on it at all, and help them, provide remote assistance in some way, set things up, that kind of thing. And it’s great. It actually is quite empowering for me as a blind person to provide that sort of assistance to a sighted family member. But I’ve got to use NVDA for that.

Is that one of the features that we might be able to see in future, where I can run JAWS and get feedback through RIM on a PC where no screen reader is running?

Ryan: Yeah, I would certainly love it to be.

If you want to use NVDA, fine. But I don’t ever want to force people to use NVDA.

Jonathan: Right.

You mentioned developing JAWS Tandem some more. Do you think we might be looking at unattended access for Tandem, which is a feature that’s been asked for since Tandem began, but there has been some resistance at the Vispero end to give us that?

Ryan: Yeah. I think the resistance was probably around a fear of security. And back when Tandem was coming out and kind of in its early days, that may have been legitimate.

The modern landscape now is there’s so many products that can do this, and unattended access is much more normalized, I think. So I would say it’s unequivocally on our list of primary things that we want to do with Tandem because it just provides a lot of ability to people like yourself, and others who are doing training or providing support.

I use JAWS Tandem myself to help others sometimes. And I’ve even been sitting there saying, “Man, I wish I could just reboot this machine right now and have it just reconnect.” So it’s definitely got to be a part of what we do when we update Tandem.

Jonathan: Yeah. [laughs]

So do you think that this is all potentially in JAWS 2025? Would we have to wait for that?

Ryan: I don’t know. I mean, our timeline is always changing, so I really never like to say exactly when something’s gonna be ready because who knows what’s gonna happen tomorrow? But it’s things that we will be working on in this calendar year.

The good thing about what we do now is that we don’t have to get everything done in the first release of a program, right? We’re putting updates out about every 8 weeks. So if something doesn’t make it into 2024 or the initial 2025, it’ll come out at an update.

And so I’m trying to kind of get us thinking in that way, as a company, rather than, “We have to focus on all our biggest things in the initial release.” Because the fact is, we need to be able to be more agile and shift our focus from time to time.

Jonathan: I totally applaud you for that. And I think that’s something that perhaps many people don’t realize – that they might look at what’s new in a major annual version of JAWS and say, “Well, that’s a pretty short list.” But actually, if you look over the calendar year at what has changed in JAWS, there’s usually quite a bit now.

And that’s just a response to how quickly technology is changing. You can’t afford to wait a year to put it all in there.

Ryan: Yeah. I mean, this is philosophically the reason why so many companies move to agile-style development, where they’re working in 2-week sprints and releasing updates quick, because you can be working on a feature for 6, 8, 9 months and then the world around you shifts, and that whole thing is no longer relevant.

I mean, think about when AI and your ChatGPT and other large language models hit the scene a year or 15 months ago. There were things people had probably been working on for a year or two, knowledge aggregation and things like that, that just, poof, kind of went away. And all that work was probably put to waste.

And so being agile kind of helps you move in smaller chunks, and it takes some of that risk of working on something to only have it thrown away because the world shifted around you.

Jonathan: One of my most viral posts in recent years (I think they were still called tweets back then) was when you announced the JAWS for Kiosks program and that you had a partnership with McDonald’s. And I put a tweet up there and I said, “Big news! JAWS now works with Macs. Big Macs, that is.”

Ryan: [laughs]

Jonathan: So, JAWS for Kiosks. This has provoked a bit of discussion on Living Blindfully because these are Android-based kiosks, and we’d like to know. Might we see a JAWS for Android at some point competing with TalkBack?

Ryan: It’s a good question.

So the original JAWS for Kiosks was only Windows-based because that was the place we lived in.

But about a year and a half ago, almost 2 years ago, we saw that so many kiosks are living in the Android space. And so then we developed the Android version of JAWS that’s now powering our kiosks.

Now, McDonald’s is actually still running on Windows, but there’s a lot of other ones that are now running on Android. So that’s kind of the genesis of where it came from.

Whether you’ll see us release this as a regular product for Android, I’m not sure. I mean, I think we don’t really see the business model is there right now.

I mean, if we think of the economy of scale, the real question is, “What would people be willing to pay?”, I think is the bottom line. And in most cases, we’re so used to 99 cents, $1.99. I mean, if I download an app from the App Store on my iPhone and it’s like 4 bucks, I’m thinking “That’s steep. That’s a lot to fork over for the app.”

So you can imagine if a small number of Android users at a small price is hard for us to support, all that has to be done with that.

So I’m not saying that it won’t happen. I’m saying we will have to really be able to see a clear business path to make it financially viable for us to put all the development in. Because right now, where we’re focusing on kiosks, we have a pretty narrow path that we can work in with JAWS. And if we want to expand that now, we’re talking about accessibility of all kinds of other apps and making sure we’re working in those apps’ environments, and then we’re providing support for it and updates. And so it becomes a much larger project for us to deal with. So that’s really been the holdup for us.

Jonathan: Yeah. And I suppose, the other question is, is there enough available in Android for you to truly differentiate what a JAWS for Android might do from TalkBack? Is there enough autonomy there for you?

Ryan: Yeah, there’s a technical piece to that. And it’s the same question we always ask with things like Mac, and Chrome OS, and other operating systems. Sure, maybe we could get to the basic level of TalkBack. Can we really do something different enough to be worth money for people to pay for it?

Jonathan: So you do look at Mac from time to time. Is that right?

Ryan: Yeah. I mean, we look at anything, right? We’re a screen reading company so if we’re not looking at other things out there, then we’re stupid. And so we’re doing that.

I went to CES in Las Vegas last week because I wanted to see what’s happening in mainstream technology around us, things that will all be coming to our desks, and shelves, and refrigerators, and garages over the next years, to see what is our place in that, and how can we work with some of these companies potentially to make screen reading available on them, and make sure that all of us who are blind or low vision can use this technology. So we have to have some innovation. We have to keep our eyes focused on what we do as a core business with JAWS, but we also have to be looking at what else is out there. What are the other ancillary things that we might and should be involved in?

Jonathan: I’ll just make one final comment about JAWS for Android, and then we’ll talk about CES because I think that’s very interesting.

I, on my iPhone, have various apps. Let me think. I’ve got Fantastical, I’ve got Drafts which is a great composition app for text, I’ve got Ulysses which is another word processor, and Todoist (a task manager).

And I mentioned those because all of those are subscription-based apps, and I pay anything from, I don’t know, $19 or $20 US a year upwards to subscribe to those. So given the number of blind people using Android, there may well be something sustainable there if you were to go to a subscription-type model.

Ryan: Yeah, I think there possibly could be. It’s certainly the model that people are used to now.

And so it’s really a math problem, I guess, at the end of the day, and I don’t know that we’ve solved the math problem yet.

Jonathan: Right, right.

So CES is really interesting because in the past, there was just no real significant mention of disability at all at CES. It’s this wonderful geeky paradise, and a number of blind people would go there to get their geek on. But I was quite interested to see a large number of disability-specific announcements happening at CES this year.

So the mood is changing a bit, and I guess that just represents an awakening of the industry about disability in general.

Ryan: Yeah. And CES and the Consumer Technology Association really put an emphasis on disability this year. I mean, they were very openly publicizing that disability inclusion has an important place in all this. I was glad to see that.

And what I experienced, I saw a number of disability-related technologies. But also, obviously, way more just mainstream technology.

And in conversations I had with people, I would always ask, obviously, about “Have you considered accessibility in this product?”, and “How have you considered it?”, and so on and so forth.

And often, the answer I got was, “Yeah, we know accessibility. We know what it means. We know we should be doing it. We just haven’t gotten there yet.” And that could certainly lead us to discussions about how to get there.

But the theme that I saw that I think is different from maybe in the past 5, 10, 15-plus years, people didn’t even know the word “accessibility” in the past. They didn’t know what it even meant. Now, people know what it means. They know that they need to do it. They may not be doing it so we haven’t gotten there yet, but we’ve at least gotten the education piece a little bit better. That was a clear theme that I saw.

Jonathan: That segues nicely into seeking your comment on the whole question of accessibility overlays. Because so many people say, “Yeah, I know about accessibility. We want to do something about it.”

And then, these companies come along, and they say, “If you pay us a fee, we’ll take care of accessibility for you.” And that’s specific to the web right now. Do you have a view on the value or otherwise of those accessibility overlays?

Ryan: Boy, that’s the 64 million dollar question I think that everyone’s trying to answer.

Overlay probably has a bad connotation right now. I mean, it depends on what you really define as an overlay. We have connotations that we use with the word overlay as something that’s changing the whole way that we interact with a site. And sometimes, it makes it less accessible. There’s no question of that.

But you might often also consider other technologies as an overlay. I mean, I can see, for example, AI becoming a type of overlay, right? It would allow me to interact with a website in a totally different user experience than what was designed by the designer. So I want to be careful with the terminology, only because we could probably paint ourselves into a corner.

I think the reality is, yeah, there’s certainly been a lot of problems with the “traditional overlays” that have been out there, and the companies who have created them, and a lack of accessibility awareness.

My belief is that that’s probably changing, that those companies are becoming more aware of the community of people that this impacts. I can’t speak for them, obviously, but I see some indicators of that.

So I think the space is probably going to change over the next few years. And whether it will be good change or bad change, I really don’t know. I don’t think that I’m in it deep enough to really see what’s going to happen.

But I think the overarching idea of having personalized interfaces with information, there’s a lot of merit to that. And I think that’s actually where AI is going to take us, too. I just don’t know how that will play out.

So that’s kind of my long-winded answer to say I think there’s positive direction, more so than a few years ago. But there’s still clearly a long way to go.

Jonathan: It was a very interesting and thoughtful answer, actually. And when I think about this through the lens in which you frame it, JAWS, in a way, has led the way here, because the virtual ribbons feature was introduced to JAWS in response to the fact that a lot of people, at least initially, struggled with the ribbon model. They hated it, and they wanted something done about it.

So JAWS essentially re-engineered that user interface in certain Microsoft products, and some people really liked that.

Ryan: Yeah. I mean, you could certainly make an argument that JAWS could be an overlay. It can function as an overlay. It provides an alternative interface.

I’m probably stretching it, but Research It was kind of like that. It’s a way to get some information in a totally different way.

There are heuristics, or things that JAWS does on websites and has for years, when it comes across a form field or something and there’s not a clear label. It tries to do some guesswork based on the text around that field to try to identify what the label is. That’s kind of along the lines of an overlay.

But it’s done in a way to help people get their work done. We don’t do it because we want to go against what the standard of the website is. We do it because we would rather guess and be wrong at what a label of something is than not expose a label at all, when we know there’s not a label there.

So those are just some small examples of where we’re kind of doing that ourselves, but in a very controlled way, and obviously a highly tested way, and not getting anywhere near the same response as the traditional overlay companies have been getting.

Jonathan: As we look to wrap up, the future of technology is so hard to predict. And again, another insightful thing that you’ve said is, we only need to look back what, about 14, 15 months maybe, and there was no ChatGPT. I mean, it’s extraordinary, isn’t it? [laughs]

Ryan: It’s hard to believe.

Jonathan: Yeah, how quickly this can move.

But I wonder whether you have any thoughts on how AI might impact screen reading. For example, we talked earlier about the fact that one person’s excess verbiage is another person’s essential information. Might there be a way in time for screen readers like JAWS to learn what it is that an individual specifically wants to know about a page?

Ryan: Yeah. I think that’s exactly a great example of where AI can take us.

We had done some research in an area, even probably 3 years ago, about trying to predict how people were navigating web pages. This never got into the product. It was a research project.

But if we saw you were navigating by the tab key more, or moving by headings, or let’s say we saw that you were moving by the tab key, but we saw there were 5 headings on the page, we might bring up a prompt that said, “Do you know that you could use headings to navigate quicker through the different sections of this page?” Because we might assume maybe you don’t know that you can navigate by headings. That’s just one crude example.

But it was already in that idea of, can we help predict ways to help people become more efficient in the product? And AI gives us tons more horsepower in being able to do that. So that’s an area I think we’ll certainly look at.

I mean, right now, I can easily say we’re working on how to leverage large language models to help empower people to use JAWS better.

Part of the value of JAWS is just the massive amount of things you can do with it – the enormous number of settings, the enormous number of configurations, the huge amount of training content. There’s a lot of power in that.

But there’s only power in as much as you can figure out how to use it, and that’s always been a challenge for users – to figure out how to take advantage of the settings, how to take advantage of the training and the help information that’s out there.

AI and large language models are a great tool to help people get access to that information, and that’s where we’re putting our efforts first. I’m pretty sure that we’ll see some things actually coming out in our products over the coming months to a year, and it’ll be in those areas, because we want to empower people.

And then, I think AI overall … I think what it will help do for screen reading is just help us find the right information quicker. Because most of the time, if I’m doing something on the web, I’m looking for some piece of information. I’m browsing a site, trying to figure something out. I’m trying to buy something, or pay a bill of some sort, or find some information. And AI can be a tool to help you find what you’re looking for in a much more efficient way. And my guess is that’s where there’s going to be a lot of improvement.

I think screens are not going to go away. I’ve had people say, “Oh well, with AI, now you can just use voice to interact with everything.” And I even had people tell me at CES, “Why would you need a screen reader? You can just interact with voice.”

It’s the same reason that I don’t ask my Soup Drinker to read my email to me, right? [laughs] Because I don’t want to sit there and use voice commands to interact. Think of how much faster I can press a keyboard command than I can talk.

Jonathan: Yeah.

Ryan: And so there’s still a need for screen reading. And there will be, because there’s still a need for screens and visual information. It’s just, I really think AI is going to change the way we access that information.

Jonathan: Hopefully, more efficiently.

Ryan: I think that’s the key, and I think that’s where there’s a lot of opportunity.

Jonathan: Yeah.

Ryan: So probably about a year ago, when I started seeing what was coming with some of this stuff, I was thinking to myself, “Self, this is the best time to be in technology right now.”, because it’s about to be a crazy roller coaster ride. And that excites me, and it scares me too because you just don’t know what’s going to happen, and it’s a wild ride to hang on to. But I’m looking forward to it.

Jonathan: See, as long as you have that fascination and love for things new, you’ll be all right. I get told that eventually, mine will run out and that I will just find all this change just too much to cope with. And then, I’ll give this podcast up.

But I just think seeing the progress that we make year on year and all the exciting things happening, I get a buzz out of that. And it sounds like you do, too.

Ryan: Yeah, I do. I mean, just seeing the way that my life has changed in a year. I mean, things like Be My Eyes and the Be My AI functionality, and other image description things that are being worked on, that’s just opened doors. I kind of think of it as a new frontier. There was always this thing on the horizon of being able to access images and get descriptions of them.

And now, we’ve probably gone over that horizon. That was on the horizon for years, for many years. And it’s just all of a sudden, boom! Here we are, heading on the other side of that.

And so I’m fascinated to see, a year from now, what other things? Because I didn’t see this coming a year ago, or 15 months ago. What thing will we be talking about a year from now that we had no idea was about to be broken down for us? That’s part of what excites me.

Jonathan: For me, the interesting thing at the moment is creating images, and the fact that you can give a very detailed description of the image that you want to create and actually create something fairly plausible.

I do find verification a challenge because as a totally blind person, I have no way of being reassured that I have what I think I have. Unless I do it in reverse and get Be My AI or something to recognise it. [laughs]

Ryan: Exactly. Something to read back, yes, yes.

Jonathan: But it’s a really interesting area.

Ryan: Yeah, I agree.

Jonathan: Well, as I mentioned at the beginning, Freedom Scientific has its own media. So I am very grateful, very appreciative of the fact that you’re willing to come on here and face some tough questions. That’s a great credit to you, and I hope that we can keep in touch.

Keep up the great work. Thank you very much for coming on the podcast. I really appreciate it.

Ryan: Absolutely. Thank you, Jonathan, and thanks, everyone.

And please, always let us know what you’re seeing, what you’re finding, the challenges that you’re having. Our job is to make JAWS, ZoomText, and Fusion for you, for all of your listeners, for all of us. So please, I encourage your audience to keep in touch with us as well.

[music]

Voiceover: Stay informed about Living Blindfully by joining our announcements email list. You’ll receive a maximum of a couple of emails a week, and you’ll be the first to learn about upcoming shows and how to have your say. You can opt out anytime you want.

Join today by sending a blank email to announcements-subscribe@LivingBlindfully.com.

Why not join now? That’s announcements-subscribe@LivingBlindfully.com, and be in the know.

Level Access Buys UserWay

I wanted to cover a couple of notable news items.

First, in episode 254, we spoke with Lionel Wolberger. He’s one of the co-founders of UserWay.

We had a robust discussion about the future of web accessibility, whether AI really can make a tangible difference, and how we can tell what that difference actually is.

Bear with me for a second while I take you to a blog post that was posted some time back now, to be fair, from a leader in the accessibility field called Level Access. And Level Access, among other things, said this:

“When setting out to make your digital properties accessible, web accessibility overlays or accessibility widgets are one potential solution you’ll likely encounter along the way. Marketed as a quick fix, vendors promise accessibility overlays will instantly make all necessary repairs on your website and help your WCAG” (that’s the Web Content Accessibility Guidelines) “conformance problem disappear in seconds, all at a fraction of the cost of other accessibility solutions.

“It almost sounds too good to be true.”, says this blog post, “And that’s because it is.

“If you’re considering using an accessibility overlay to make your website ADA compliant,” (that’s the Americans with Disabilities Act), “don’t.”

It continues, “These Band-Aid solutions do not properly solve for web accessibility. They often worsen the CX/UX” (that’s customer experience and user experience), “and they will increase the likelihood of your getting sued.

“The only way to improve digital accessibility is to actually do the work required to become accessible, and that will not happen overnight.”

That’s been Level Access’s position, and it is absolutely consistent with the general point of view of accessibility professionals throughout the world.

So you can imagine the shock, really, [laughs] when over the break, over Living Blindfully’s break, it was announced that Level Access is intending to buy UserWay.

In the quote from the media release announcing this, it says:

“Allon and the UserWay team have developed incredible automated remediation technologies that enable organizations to move faster in their digital accessibility programs.” This came from Tim Springer, who is the CEO and founder of Level Access.

He continues:

“This combination, with our full-service digital accessibility platform, will enable us to bring powerful new tools to our customers and positions us with a robust solution set for organizations of any size and maturity.”

“We’ve long admired the Level Access team and their integration of technology, service, and subject matter expertise,” said Allon Mason, CEO and Founder of UserWay.

“This transaction delivers compelling value to our shareholders and provides our team with a great opportunity to bring our technology to a broader market. We are unified by a shared mission to make the world more accessible, and we believe this partnership will increase and accelerate what we are able to accomplish.”

“As part of Level Access, UserWay will continue to operate under its existing name and brand.”, according to the release. “Allon Mason will continue to lead UserWay as CEO and will become President of Level Access. The transaction is expected to close in early 2024, subject to approval by UserWay’s shareholders and receipt of customary regulatory approvals.”

So, this does seem like an absolute 180 on the part of Level Access: from decrying this technology to buying one of the leaders producing it.

But I continue to be open-minded. I’m eager to get evidence that this technology is evolving, and that it’s actually going to make a positive difference to our lives.

I hope that when regulatory approval is obtained and when shareholder approval is obtained, we might get Tim Springer from Level Access on the show to talk about this.

My intention in doing so is not, in any way, to play gotcha! It’s really to understand what’s brought this change of mind about.

Has the technology evolved in a way that someone like Tim thinks this is a worthwhile thing to do? Well, obviously it has, or they wouldn’t be spending $99 million making this happen.

As I said when we introduced Ryan on this episode, change is the only constant in technology. And if we concluded that this technology is garbage and harmful because of things that happened 3 or 4 years ago, when the technology has, in fact, evolved and changed, and, equally importantly, some of the business and marketing practices around this technology have changed so that they no longer portray blind people as hungry litigants who just want to entrap businesses, then we owe it to ourselves, I think, to take another look and to make an assessment of this technology based on where it sits today. Because many of us have been around a while, and have seen technology that was rough and ready to begin with, or perhaps took some sort of unfortunate path, right itself and become valuable.

Now I’m not saying for a second that that is what has happened in this case, because I simply have no evidence.

When I go on to one of these sites that has one of these accessibility remediation tools on it, I still haven’t found a single site where I feel like my user experience is better because that technology has been deployed. I’m open-minded, and I’m eager to see one.

If anybody has found such a site where maybe they’re using a tool like AccessiByeBye to turn the remediation off and they think, “Wow, this is so much worse now that the remediation has been turned off.”, please let us know about this. I want to play with that site. opinion@LivingBlindfully.com, 864-60-Mosen.

But certainly, this is an extraordinary turn of events, and I do hope that we might be able to talk with somebody concerned about the rationale for the acquisition in due course. It’s a really interesting one.

A New Key Is Coming to Your Windows Keyboard

Another thing that happened over the break that’s worthy of mention is that Microsoft has announced the first change to the Windows keyboard since the Windows key was introduced. Yes, some of us go back far enough that we remember when there wasn’t a Windows key on the Windows keyboard.

Now, they have decided to introduce a Copilot key. Microsoft is really going all in on Copilot. Copilot is everywhere, in all sorts of forms.

You may find that on your Windows 11 PC, you can now press Windows with C to bring up Copilot. You’ll certainly be able to go to CoPilot.com. There are Copilot apps for iOS and Android. You will find Copilot in some situations in GitHub and, I believe, Visual Studio. And if you have a Microsoft 365 subscription, you’ll be able to pay an extra 20 US dollars a month and get some additional Copilot features in Microsoft Office, if you want to.

So Copilot, the whole AI thing, is really big for Microsoft. And they want to introduce a key, even though you can press Windows with C right now to get into Copilot.

Those of us who are blind and remember all these shortcut keys, we are not the norm. Many sighted people don’t know about all these shortcut keys. And Microsoft wants a key on the keyboard that says “Copilot”, hopefully so that people press that key and try it. There’s no doubt that it’s a marketing decision.

In some implementations of this Copilot key that have been shown on the web (and a number of tech publications have commented on this), the Copilot key is replacing the application key.

What I found a bit concerning, to be honest, about some of the discussion I got involved in on social media once Microsoft made this announcement is that not everybody seems to understand that Shift+F10 is not a direct replacement for the application key. It does a lot of the same things that the application key does, but there are some instances where Shift+F10 does nothing, yet the application key will invoke a context menu.

There are other situations where the application key can be used in conjunction with a modifier, like Ctrl and Shift and Alt, to bring up alternative context menus. In Windows 11, where they’ve abbreviated and messed around with the context menu, if you press Shift with the application key, you immediately get the full context menu back. So the application key has many important uses.

And it is also true that there are many laptops out there now that don’t have an application key. And some of us who understand how important the application key is remap another key to become the application key.

On my ThinkPad, for example, to the right of the space bar, there’s a Print Screen key. I have no need for the Print Screen key, so I always use SharpKeys to remap that Print Screen key as an application key. It’s a slightly geeky process, but it’s not too complicated, especially if you get some assistance the first time you do it. You can also use Microsoft PowerToys, I understand, to do the same thing.
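For the curious, what SharpKeys actually does under the hood is write the Scancode Map value in the Windows registry. As a sketch only (double-check with SharpKeys itself before applying anything like this by hand), a .reg file that makes the Print Screen key behave as the application key would look something like this, using the standard scan codes: Print Screen is E0 37, and the application (menu) key is E0 5D:

```reg
Windows Registry Editor Version 5.00

; Scancode Map layout: 8 header bytes (version and flags, all zero),
; then a count of entries including the null terminator (here 2),
; then one 4-byte mapping: the key to send (application key, E0 5D)
; followed by the physical key being remapped (Print Screen, E0 37),
; both little-endian, then a terminating null entry.
; A restart is needed before the remap takes effect.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Keyboard Layout]
"Scancode Map"=hex:00,00,00,00,00,00,00,00,02,00,00,00,5d,e0,37,e0,00,00,00,00
```

SharpKeys generates this value for you, which is why the graphical tool is the easier route. PowerToys Keyboard Manager, by contrast, remaps keys at runtime while PowerToys is running, rather than writing this registry value.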

Some people who don’t have an application key on their laptops perhaps don’t fully appreciate what it is that they’re missing by not having one. So it was a concern to me that the application key could become even more of an endangered species.

Others share this concern. Several of us reached out to Microsoft.

These are the key points that I made to Microsoft:

“The application key offers to a keyboard user what a right click offers a mouse user. It is a fundamental part of efficient computer use, and might be used many times each day. It needs to be an effortless part of the user experience.

Second, Shift+F10 is far harder to press for everyone, and may be impossible for keyboard users who use one hand, since the keystroke requires two hands. And it may be hard for some blind people to locate F10 quickly on laptop keyboards where the function keys are grouped without any gap.

Third, many laptop keyboards now ship out of the box with the function keys performing system functions, not their traditional function key roles. In some cases, a quick search with Bing, of course, will reveal an accessible way to toggle the keys back to their function key state. But I have seen several laptops where the only way to change this is by going into the BIOS, which is not an accessible process for a blind person. Having to press FN+Shift+F10 until they find sighted help is not a good user experience. That is quite a complex keyboard combination for a function similar to the one sighted people can get just by performing a right click.

And then, I go on to point out some of the things that I’ve already stated in this piece so I won’t repeat them.

Before I talk about the response I got from Microsoft, which has been very respectful and helpful, there’s a lesson in advocacy here that I want to share, because so often, people get apathetic. One person told me, in a quote that really resonated with me in a bad way, “I’m sure things will work themselves out.”

And I came up with a reply that I think is worth repeating, and it’s this: “Things don’t work themselves out. People work things out.

If we have an issue that we believe might impact our productivity, our computing use, then we have an obligation to do what we can to make sure that the company concerned is aware of that issue.

We can’t assume that things will work themselves out if we don’t take the time and the trouble to let a company know that we’ve got a problem. That is very ostrich-like. And if we don’t take the time to advise a company of a concern that we have at a time when it might make a difference and something happens that we don’t like, then we’ve only got ourselves to blame. We do have a duty to speak up for ourselves.”

So I’ve got the following information from Microsoft that may or may not be helpful.

“The placement of the Copilot key is going to vary, depending on the keyboard and the manufacturer. In general, it will be on the right side of the spacebar, opposite the Windows key. In some instances, it will replace the right control key. On some larger keyboards, the right control key and the Copilot key can both fit. And in other instances, the Copilot key combines with the menu key.” (This is what I’ve always called the application key.) In that last instance, the menu key becomes secondary, invoked with a press of the FN key. So if you want the application key and your application key has gone because of the Copilot key, you can press FN with Copilot to get the context menu back.

What I don’t know about, though, is what happens on desktop keyboards where the FN key isn’t customarily available. And I will see if Microsoft might give me further information on that if they have a solution to that at the moment.

Finally, I don’t yet know whether the Copilot key will have a key code that’s remappable. The FN key, for example, is not.

But hopefully, the Copilot key will be remappable, and we’ll be able to use it with SharpKeys.

If I get any further information on this, I’ll keep you posted.

[music]

Voiceover: Like the show?

Then why not like us on Facebook?

Learn about upcoming episodes, and see when a new episode is published.

Find our Facebook page by searching for Living Blindfully, or visit us directly at facebook.com/LivingBlindfully. Living Blindfully is all one word. The URL again is facebook.com/LivingBlindfully.

The AI Microsoft Disability Answer Desk

Last year on the podcast, we featured Microsoft’s Disability Answer Desk AI implementation. You can find that on the Be My Eyes app.

Christopher Wright has tried this. He says:

“I tried the Microsoft version of Be My AI, and it gave me misleading information about sound support in Hyper-V, echoing the experience in your demo.

I’m not surprised. And hopefully, the accuracy will get better over time.

But it was simultaneously amusing and concerning. It taught me I had to enable some kind of remote FX video adapter in the virtual machine to enable sound.

I corrected it, saying this option wasn’t available.

It came back with information about enhanced session mode, which is correct, but it doesn’t enable sound without additional configuration inside the guest OS, according to my research. It failed to talk about this, or how a totally blind user would configure it without sound support in the first place.

I used OCR to get through the Windows 10 setup process, but it was once again wrong when it told me I’d get sound at the Windows out-of-box experience screen.

Despite its flaws, the technology is very interesting. It correctly told me how many times to press Shift+Tab in the Windows installer to select options, and I’m pretty sure it could assist you in choosing a boot device if you took a picture of the boot menu screen on a physical computer. I’m not brave enough to have it assist me with changing BIOS or UEFI settings.

The bigger problem with AI, and the reason why I won’t trust it, is the censorship and general inaccuracy. Rather than being corrected, it is programmed to not discuss opinions or facts on certain topics such as politics. It has biased viewpoints on said topics. It refuses to recognise faces, etc. This, in my opinion, severely restricts the potential of the technology if we leave it in the hands of big corporations that can dictate how it behaves.

Sure, this may not be a problem for something like tech support, but I consider AI to be an interesting toy, not this groundbreaking thing that will change the world in massive ways people would have you believe it to be. I guess time will tell.”

Thanks for writing in, Christopher.

I was having a chat with a journalist over my summer break about this question of facial recognition. And I think there has to be a distinction drawn between describing someone’s face, which is information that a sighted person can glean by looking at the picture, versus actually identifying someone.

Now, that latter issue has significant privacy ramifications, and we need to work through that very carefully.

Obviously, for many of us, if we’re walking into a crowded room and we’re wearing a pair of glasses, as blind people, it would be super useful to be able to identify who’s in the room so that we can go and talk to someone we want to talk to, and avoid those we don’t want to talk to. That, to me, does require some sort of mutual consent.

But it’s a very different thing when certain AI technologies are blocking visual descriptions of faces. That information is available to any sighted person who cares to look at the picture, and we shouldn’t be deprived of that information. That’s discrimination in that case.

So I’m pleased that there seems to be some sort of détente that’s been arrived at between Be My AI and ChatGPT. Because when you try and upload an image to other implementations of ChatGPT, it is obscuring the face. Not with Be My AI, thank goodness, and I’m delighted by that.

But I think there is still the wider question. Why do they feel the need to obscure these faces in general implementations of the product?

Regarding the voicing of opinions by AI, I don’t think that would be a very good thing at all. Does anybody really want this? Because what opinion do you want? Are you going to have AI that basically locks you into your bubble in a post-truth society?

There are certain philosophical questions, like “What role does the state have in minimising inequality?”, for example, and you could have all sorts of theories espoused on both sides of that equation.

But when you get a more contentious issue, like the fact that there are still some people in the United States who think that the 2020 election was stolen, [laughs] why should AI facilitate people going down that particular nonsensical conspiracy theory rabbit warren?

And I think that this is eventually why there will be some sort of regulatory authority on artificial intelligence, because it is actually quite an important question. And leading that charge will be the EU. I’m sure there are some very interesting regulatory developments taking place, which it looks like will result in things like multiple app stores in the Apple ecosystem.

It’s thanks to the EU that we have USB-C on our iPhones. Thank you, EU.

And they are taking an interest in the whole question of artificial intelligence, and how it should be regulated. They’re not the only ones. But I think a lot of leadership will come from the EU.

Have you been trying the Be My AI implementation of Microsoft’s Disability Answer Desk? How do you find it compares with giving a good old human a call?

Tell me about your experiences, if you’d like to. 864-60-Mosen is my number in the US – 864-606-6736. You can email opinion@LivingBlindfully.com.

My Favourite Podcast App

The Castro app has come and gone over the time that we weren’t publishing. At one point, it went away, and the whole domain name disappeared. And everybody thought, “Okay, this is really it. Castro seriously has ceased to be now.”

But then, it came back again. So it’s obviously a very unreliable app to still be using.

I do know one person, my spotty nephew, actually, who is clinging on to its use.

Matthew Whitaker is talking about his favourite podcast app, and he says:

“I’m currently using Overcast. The layout is really good, and I like the features.

I do wonder, though, when the developer will come out with versions of it for multiple platforms. Unless that has been done already.”

To the best of my knowledge, Marco has no plans to develop for other platforms. But he is completely rewriting Overcast at the moment, so I’m hopeful that some of our accessibility or feature requests will be implemented in the new version of Overcast. I look forward to seeing that.

If you want multiple platform support, Pocket Casts is your best bet. It is a very good app.

But as I’ve said in the past, the reason why I can’t use it daily is that you don’t get the show notes – the little preview of the episodes when you’re scrolling through, as you do in most other podcast apps. And because I consume a lot of news content, it really would slow me down to have to investigate the show notes, or play a bit of each episode. If they’d resolve that, I think Pocket Casts would be a serious contender.

Matthew also wants to know: “Are there any good podcast apps for Android?”

I’m not a regular Android user, but I have heard people praising Podcast Addict, which is a very feature-rich podcast app. And I believe it’s quite accessible.

If anybody else wants to comment on podcast apps for Android, do feel free: opinion@LivingBlindfully.com, or 864-60-Mosen. Is Podcast Addict the rocking one, or are there other ones that are also worth considering on the Android platform?

Sonos ERA300 Versus 5

Let’s talk about one of my favourite subjects, and this is Sonos.

John Dowling’s writing in. He says:

“Hi, Jonathan,

I recently became a Living Blindfully plus subscriber. It’s so cool to get those episodes way ahead of time.”

Thank you very much, John. I really appreciate the support. It means a great deal.

He says:

“I’ve thought about getting some Sonos speakers for a while now. I’ve heard really good things about the ERA300s, and I’m wondering if I should get those, or the 5s.

I mostly play Classic Country from the 50s to about the 80s, and I’ve got a Sirius XM subscription and love playing Willy’s Roadhouse and the Outlaw Country Stations. I also have Apple Music and have a lot of different playlists.

Price plays a key role in this, and it looks like a pair of ERA 300s are about $900, which is not as bad as the almost $1000 price tag for the 5s.

What are your experiences with the ERAs?

P.S. I’ll have to check out Mushroom FM’s Country Selections and see what y’all got. Hank Williams is my favourite.”

We’ve got a lot of Hank Williams there, John, and you can hear Guitars and Cadillacs, which is our all-country show on a Sunday evening, at 6 o’clock Eastern Time, and it also replays at 7am Eastern Time on a Thursday morning.

And we do specialise in music from the 50s to the 80s, but occasionally, you will find the country programming branches out a little bit.

So back to your primary question about Sonos.

I have heard reviewers say that in their view, the 5s sound a bit better with good old-fashioned stereo content. And if you were to get a pair of 5s and you listen to a lot of stereo material, then I’m sure you won’t be disappointed.

The ERA 300s are no slouch. They sound really good as well, with the added benefit that they’re slightly more futuristic, and that they will play Dolby Atmos content.

And although the music that you are listening to was recorded originally in stereo or even mono, quite a bit of that music is now being remixed for Dolby Atmos, and you’ll be able to play that through Sonos on Apple Music in Atmos, or Spatial Audio as it’s sometimes called. Some people think this is a bit of a gimmick.

I personally found it really immersive. What you find is if you put the ERA 300 somewhere a bit elevated and there’s a wall behind them, it really can sound like sound is bouncing off that wall. It’s quite immersive, it’s quite effective.

Additionally, the 300s have Amazon Voice Assistant support – the good old Soup Drinker, so you can talk to that, and that’s very useful. And it’s newer technology, so you may find that the ERA 300s just last a little bit longer.

If you can, go in and have a listen. See if you can get a good AV store that sells Sonos products (some of them have listening rooms) to take you in and have a pair of 300s set up and a pair of 5s set up. Listen to some of the music that you like on both pairs, and see what grabs you.

I think on balance though, for me, I would go for the 300s, just because there’s gonna be a lot more music coming out in Spatial Audio, and I think that sounds pretty cool.

If you really wanted to get a good experience, whether you buy the ERA 300s or the 5s, pair them with a sub. Put that thing somewhere in the corner, and you’ll get amazing bass on top of everything else.

Once you get into the Sonos ecosystem, it’s pretty addictive. So I hope it works out all right for you.

Looking for a Laptop That Does Audio Production

Caller: Hey, Jonathan. Charlie here in South Africa.

I am looking for an affordable laptop that can help me audio produce my work. I do mostly audio, and I use Audacity and Reaper, sometimes concurrently, or sometimes one after the other.

Jonathan: Well, for basic audio recording, Charlie, you shouldn’t need anything too fancy.

Where things will start to get bogged down is if you’re doing a lot of work with plugins, where you might be using things like de-breath plugins, or audio processing plugins. Most compressors are pretty gentle on the CPU these days, but there’s no doubt there will be certain high end plugins that drag things down.

If you’re just doing basic recording, normalizing, and maybe adding a touch of compression, you should be okay with pretty much anything.

I can produce this podcast on my ThinkPad or at least, edit it. So quite often, I will record in the studio and if I want to do a bit of editing, I’ll take my ThinkPad somewhere with me, and I have the files, and I can edit just fine with my ThinkPad, which is not a cheap laptop, I readily concede.

But I think the key thing will be RAM. The more RAM you’re able to get into the laptop, the smoother the experience is likely to be.

If you can get solid state storage, that will help you as well. But that will add some cost to it.

A fast CPU is always nice. But if we’re on a budget here, I would put that down the chain a little bit in terms of your priorities.

One thing that could make a really big difference is if you get some sort of external audio device as well. Because let’s face it, the quality of built-in Windows audio in many devices is rubbish, rubbish. [laughs] You could get something like the Audient Evo 4, or the Focusrite Scarlett or something like that. And that will really help a lot in terms of quality audio editing to just stop using the built-in sound, or maybe use that for your screen reader and have a little external audio device that you can connect to your laptop. And if you ever go to a desktop PC or someone else’s computer, you can take that with you and plug it in there as well.

Making the iPhone Action Button Do Double Duty

Caller: Hey, Jonathan! It’s Dennis Long.

I just got an iPhone 15 Pro Max.

You did a demonstration. I forget when it was. But it talked about the way you could get 2 presses or something out of the action button. I really wish Apple would make it multi-press.

But can you tell me that information again? What is it? How do you set it up? And if you could go over that, that would really be appreciated because I would love to have the action button take a picture. You know, because this will mean you don’t have to try to find the camera button on screen.

And I also want to know what other uses people have found for the action button.

Jonathan: Well, I create a bit of content on the iPhone. I consume an enormous amount of content on my iPhone. I don’t own any of the specialist players or anything like that because I like having the one device. All my stuff is there. It works super well.

And what I’ve done with the action button is I have it playing Overcast from anywhere because I listen to so many podcasts. So I can press and hold the action button and my last podcast will resume, or my next podcast will start playing.

But wait, as the infomercials say, there’s more. Because I’ve got my back double tap and back triple tap set up to also engage with Overcast. So when I’m listening to a podcast, even if I’m not in Overcast, I can do a back double tap and it will skip me forward 30 seconds. And if I want to hear something again, I do a back triple tap and it skips me back 30 seconds. And all of this is universal, so I’ve got this kind of global podcast set up on my iPhone 15 Pro Max.

In terms of programming the action button, the sky is the limit because there are some built in functions to which you can assign the action button, but you can also execute a shortcut. So if you can find some cool shortcuts for the action button, or you’ve already got some cool shortcuts and you want to assign one to the action button, then knock yourself out. You really can do all sorts of things with the action button.

And this is getting to the answer to your question. There have been some people who have put a variety of shortcuts together pertaining to the action button. I have seen several that will bring up a menu of choices when you press and hold the action button.

I think what you might be referring to is an article that I shared on the Living Blindfully Mastodon account where I think somebody’s got a double tap and triple tap of the action button working, but I’m not sure if that’s actually working that well.

What I would recommend, if you want to research this, is just do a search in your search engine of choice for iOS action button shortcuts and you should be able to get quite a few that you can download and try and see which one meets your needs.

But it’s a really good question that Dennis has asked here. If you’ve got one of these fancy new iPhones with an action button, to what use have you put it? Have you found that it’s really handy to have? Get in touch and share your experiences. We’d be delighted. 864-60-Mosen, or opinion@livingblindfully.com

The Bonnie Bulletin, Yoto, ThinkPads, Tablets, and Terminology

[Bonnie Bulletin music]

Jonathan: Did you, Bonnie Mosen, have one of those things where when you had a big summer break, you would come back and be invited to write an essay on what you did in your holiday?

Bonnie: I don’t remember ever doing that.

Jonathan: See, we did this every year in New Zealand.

Bonnie: I’ve heard people that did. What did you do? I mean we may have, but I just, I have no memory of it.

Jonathan: [kid’s voice] What did you do on your holiday?

[adult voice] And maybe we’d read them, or people would compare them. The teacher would highlight the highlights and things like that. It was sort of an obligatory thing that you do.

Bonnie: What did I do on my summer vacation?

Jonathan: Yes.

Bonnie: I don’t remember doing it. I mean, maybe I did, but…

Jonathan: This is your audio equivalent anyway, and it’s never too late to pick up a new habit.

So we have all sorts of things. And I know people will be interested in the technology so we should jump to Christmas, and talk about some new technology. Shall we talk about Florence’s Yoto player, first of all?

Bonnie: Yeah, yeah.

Jonathan: There’s a whole category of these devices out there for kids. I think there’s a little bit of awareness emerging that it’s not necessarily a good thing to have young children with devices that are connected to the Internet, can spy on them, and have microphones and all those sorts of things. And there are a few of these music players out there now, or media players really, because they can play stories and music.

And I chose the Yoto player (that’s spelled Y-O-T-O), after a bit of research. And the reason why I chose this one was that it allows you to upload your own content very easily, including MP3. So all of them allow you to read into the device, basically through an app on your phone. But I wanted to be able to upload content that I’d recorded here in my studio for Florence, and I’ve recorded quite a few stories, plus old stories that I have from various things, and songs and that kind of thing.

So the way the Yoto player works is that you get these little cards that, under the hood as they say in America, are NFC tags. So not only can you just slot the cards into the Yoto player any way (they go in any direction, which is great for youngsters), but you can also wave the NFC tag at the phone when the Yoto app is running, and it will play the content as well.

It downloads the content the first time you put the card in the slot, so it needs to be connected to the internet for that to happen. But once it’s downloaded onto the built-in memory of the Yoto player, then it’s there. And when kids take it on road trips, they don’t need to be connected to the internet.

You can also have streams set up with it. It has some content that does stream over the internet. But it’s a very specific device that does specific things.

Florence can pull the cards out. She’s only just turned 1. She had her first birthday on the 8th. [laughs] But she can pull the cards out. And then, she gets upset that the music stopped.

Bonnie: She hasn’t quite made the distinction to put them back in.

Jonathan: That’s right. [laughs]

Bonnie: And it’s very durable.

Jonathan: Yeah.

Bonnie: Because kids throw things and have tantrums, and that sort of thing. So it’s a very durable thing, and it can stand drops and spills, and all the things that happen when you’re little, which is good.

It’s not like a $1,000 or $2,000 iPad or something. [laughs]

Jonathan: No. Her parents use it for bath time a lot, apparently, where Florence can’t keep pulling out the Yoto cards. And I’m sure she’ll get better at it as she gets a bit older.

I’ve uploaded quite a bit of content. We bought some content. There’s a bit of Beatles stuff in the Yoto categories as well.

Setup was pretty easy. But one area where I did need sighted assistance is that you had to enter some sort of, I think, serial number or code displayed on the device when you first connect it to a Yoto account. You might be able to do that with, I don’t know, Be My AI or Seeing AI, I’m not sure. But I had a pair of working eyeballs available, so we did it that way.

But apart from that, setup is pretty straightforward. The app’s okay, and we’re up and running.

Bonnie: It has a screen on it that shows some illustrations of the stories and stuff.

Jonathan: Yeah, it’s a cool player.

It seems to be quite popular, too. When I mentioned this on Mastodon, several people said “We’ve got Yoto players for our kids as well.”

Bonnie: And it’s good people are looking at that, because I am completely against iPad things for small children. I just don’t think they’re necessary.

And they’re dangerous. I mean, there can be some danger with it.

Jonathan: I mean, I have a slightly different view on this. I think that there are a lot of very good learning tools on tablets these days that help young kids with reading comprehension and numeracy.

What I think is the danger though is that some parents get tablets, and they have Youtube and movies on it.

Bonnie: Yeah.

Jonathan: It’s kind of like a babysitter, rather than an educational tool.

Bonnie: There’s some studies that children are not getting the interaction with people, you know, learning from their parents teaching them the little ABC songs. It’s more connected with a device than actually a human interaction.

Jonathan: But doom has been coming to the human race for several generations now. I mean I’m sure, people thought that about radio. People certainly thought that about television. And now, they’re thinking about this in the context of tablets and things.

So I don’t know. I think we have to find a balance between reality and curmudgeonliness.

Bonnie: Yeah. And well, I was very fortunate that my parents interacted with me. You know, they played with me, I had interactive toys. I did watch TV, but there was a lot of interaction with other children and going to candy, and play school, and that sort of thing. And some of that, you’re not seeing as much now. And they have the playgroups and stuff like that.

But I see a lot of parents that just don’t interact with their kids like, “Okay, here’s the tablet. Shut up.”

Jonathan: Yeah, I agree. That is a concern.

Bonnie: Yeah.

Jonathan: If it’s done right, if you as a parent or a grandparent supervise the child with an app that offers specific numeracy, or literacy, or some sort of just a game or something that’s controlled, I think that’s quite different from basically sitting there and stupefying the child with entertainment all the time.

Bonnie: Mmm-hmm. YouTube and TikTok.

Jonathan: [laughs] Yeah.

But you also got a fair dose of technology.

Bonnie: I did. I got a ThinkPad, which I’m very excited about.

Jonathan: This is the X1 Carbon.

Bonnie: X1 Carbon ThinkPad.

Jonathan: Yup.

Bonnie: So super excited about that. It’s very light, and thin, and I keep mistaking it for things because it has a different lid than yours does.

Jonathan: Yes.

Bonnie: So it lays on my desk, and it kind of feels like the desk, and I was like, “Where’s my ThinkPad? Oh, there it is.”

Jonathan: Yeah, the texture’s different.

Bonnie: Texture’s very different.

Jonathan: Yeah.

Bonnie: But it’s great. So I’ve had it, what, not quite a month yet?

Jonathan: Yes. I told you on Christmas Day that it was coming, through a Suno AI tune. It arrived a bit late because we had it custom built at the Lenovo factory to our specs, and I’m really pleased with it.

Setting it up was pretty straightforward, and we’ve come a long way in that regard, actually, in terms of being able to use Narrator to set something up from scratch.

One thing I really do like about ThinkPads, and this will appeal to the geeks, is that there is an accessible BIOS tool which is almost unheard of. So you can run this ThinkBIOS tool which you can download from the Lenovo website, and get right deep into the BIOS. And the reason why I wanted to do that was to swap the control and function key positions.

Bonnie: Yeah.

Jonathan: In the very latest ThinkPad, which this one is not, (There’s a new generation just available in the US at the moment, hasn’t made it to the New Zealand store yet.), they have actually repositioned the function and control key themselves now finally, to be more consistent with other laptops. But going into the BIOS and being able to do those things is really sweet. So that’s a big plus for Lenovo.

We did our New Year’s radio show again, and that was a lot of fun. And it was a special one this time because it was actually my 25th anniversary of live internet radio broadcasting, and we got some lovely emails and audio messages. It was kind of like attending one’s funeral, without actually having to go to the bother of dying.

[laughter]

Jonathan: We’ve had a lot of good quality time over the break, just recharging.

Since I’ve talked about it on this podcast, what is your opinion about these organizations that are increasingly using words like vision and sight in their names when they’re providing services to blind and low vision people, but they don’t mention blind at all?

Bonnie: I think they need to stick with blind and low vision. I mean, that’s what it is. It’s blind people. And I know that there are people in the low vision sphere that think they don’t need services because they’re confused. They’re not blind.

Jonathan: Right. Yes.

Bonnie: And they don’t understand that, …

I’ll just pick on Mass Commission. Mass Commission for the Blind doesn’t serve only blind people, and that’s why they added low vision onto their name. I have no problem with that.

Jonathan: Have they added it, actually?

Bonnie: Yeah, they added it, I think when I was there, maybe a little before I was there.

Jonathan: Okay, because you still hear it referred to as MCB, don’t you?

Bonnie: Yeah. I mean, that’s because no one changes the name they use. [laughs]

Jonathan: So it’s not actually MCB anymore, is it?

Bonnie: I think it’s Massachusetts Commission for the Blind and Low Vision, I believe.

Jonathan: Ah, okay.

Bonnie: So maybe if anyone listening out there can correct me on that, …

Jonathan: I understand that because as I said in the intro on this podcast, you do have to meet people where they are.

Bonnie: Yeah.

Jonathan: And if people really do feel alienated because of where they are in their journey, and usually, in this case, it is a vision loss journey.

Sight loss is another popular term, because people seem so afraid to use the blind word.

Bonnie: Yeah, blind. I don’t like sight loss, ’cause in a lot of the fundraising things I’ve seen, they completely forget that some people never had sight to lose.

Jonathan: That’s right.

Bonnie: So that’s alienating them. I think you need to stick with blind and low vision.

Jonathan: Yeah.

Bonnie: That’s fine, that includes everybody. But sight loss, sight saving classes, sight whatever, I don’t think you need to. No, it’s blind and low vision.

Jonathan: Oh well, we’re not gonna have a robust discussion on this one then, ’cause we’re in agreement.

Bonnie: Yeah, it’s just, it’s ridiculous, and I think people need to use … There’s nothing wrong with saying blind.

Jonathan: No.

Bonnie: Some people dance around it, like you’re sightless, or unsighted.

Jonathan: Visually challenged.

Bonnie: Someone had one the other day. They said they were seeing-impaired. They were hard of seeing.

Jonathan: Hard of seeing? Yeah, I’ve seen that one as well, yeah.

Bonnie: I’ve not seen that one. They said, “Well, I can’t see at all, so I’m not hard of seeing.”

Jonathan: [laughs] That’s right, yeah.

Bonnie: And I wonder with deafness. I mean, hard of hearing sounds a bit strange. I’ve always thought that was a funny term.

Jonathan: I have seen that a lot – hard of hearing.

Bonnie: Yeah, hard of hearing. So hearing impaired? To me, that sounds better than hard of hearing.

Jonathan: Well, I always like to look at the benefits of things, you know, keeping the gratitude journal and everything. And I think one of the benefits of the hearing impairment that I have, which has been degenerative, is I have developed, I think, greater appreciation of the challenges that low vision people do have, in terms of identifying themselves.

Bonnie: Yeah.

Jonathan: Because somebody told me I’m eligible to identify myself as a deaf-blind person under the definition of deaf-blind. And I don’t mind it. I don’t particularly think there’s a stigma about it. It’s not something I’m ashamed of. But there is a little bit of me that says, “Am I really entitled to call myself that, when there are people who, for example, can’t communicate without fingerspelling and Braille, and I’m very much still able to do audio?”

Bonnie: Yeah.

Jonathan: So am I being a bit fraudulent by using the term deaf-blind to describe my particular situation?

So I get where a lot of low vision people come from, but the answer is not to erase blindness from the planet.

Bonnie: No, I mean some blind people don’t have vision. They never will have vision. And I think you need to include both groups.

Jonathan: There you go.

Bonnie: ’Cause low vision, they’re stuck, a lot of them are stuck between sighted and blind because they’ve never been told they can use their vision the way they need to. They’ve always been forced to kind of be in that side of camp, a lot of them. I’ve known a lot of low vision kids, particularly, that are older now that if they had had better, if you will, blindness training and learned how to use alternate techniques even with low vision, it would have saved them a lot of psychological, physical, mental stress.

Jonathan: As they say on the BBC, Bonnie Mosen, thank you.

Bonnie: Thank you.

[music]

Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at PneumaSolutions.com. That’s P-N-E-U-M-A solutions dot com.

Closing and contact info

That’s it for our first episode of 2024.

I’m looking forward to being back with you next week.

Remember. When you’re out there with your guide dog, you’ve harnessed success. And with your cane, you’re able.

[music]

Voiceover: If you’ve enjoyed this episode of Living Blindfully, please tell your friends and give us a 5 star review. That helps a lot.

If you’d like to submit a comment for possible inclusion in future episodes, be in touch via email. Write it down, or send an audio attachment: opinion@LivingBlindfully.com. Or phone us. The number in the United States is 864-60-Mosen. That’s 864-606-6736.

[music]